[Binary tar archive — not human-readable text. Recoverable tar header information: directories `var/home/core/zuul-output/` and `var/home/core/zuul-output/logs/` (mode 0755, owner `core`), containing the gzip-compressed member file `var/home/core/zuul-output/logs/kubelet.log.gz` (mode 0644, owner `core`). The remainder is compressed binary data and cannot be reconstructed as text; extract with `tar -x` and `gunzip` to read the kubelet log.]
3U+Tԑ (?WG Bf5{ޓ3߳R3BvyMw<#܊.6%5S[}-l_1'.77rwa}ޥތ[?^o;=TMvƞi )9ܻ|P5b'3d{nmn^SvT<&!yoJ7>HZkݻnK{㿲`˦`qaM[4a2e{JDo9LSe@C0 n-e15Ѧ1Pdž+@PSV ᪻8%ËGTWU]% ٷ NqVT0̫k6 R( -JĀA ݚ` dw1#*D"ɠ xVdž'8Ɗ)B}&=aaG~ۣŊ4dc6)8yh> (Q4vH'k2/ Xpe먀B7BJ_` ܕ,3IJ쓋.]e lâc:Z(v5!-  -tE\HhFR|"QLԴN!>ȱ/ty,%Ѻ< {.`JUW Y7x;Q=c]`եU,ԀGwf?oFF"$"hAb; ed~cNaJH\Y$Xu'Xk td;qwP65X /Vs7 bh9B/JPGY } $$\LGS{Xsv**m˩t]}+ƈv FBxZjw ,hVՌXF?6Jc/1@HDn@GIv\uR 8Ϊ$1 Yӌ(ɀ!ZP bF&#\G7wExU\4Š};,⬒%U;rZ wrtՑp4$t53kE zi/\PD>lɹOK蓇q#"cZ 0zaR s5 =A}4#Ll =( 8EژZ'c6}ES [ -+5ycZ %غ!LFdn7"T[k@׾dG6z:=(}:Nnyz ynz*^Ytڠ@2XA ֻw\iԋY+BiqV0]%#p]t5zzGe( qL:\ `S`q՘-UuR XVc5 4&eG#_ -_(`z+@Bפ&[ʡTG1jw$K^yq-n=?/hzⶣlҎ*v%15FgGs{Q uf޹?%H]If :4e`~Sԫr#O;{Ė-Mઈ6[%'g'PD7pn  tŘn̻7W~rQz=wv+g?`9 I?Nи.Oˋ3v͆Wy ^ݻQ8Nwu1#}h1AF\G>sO>{c@xքO.^~4K*&[ J;tNs' &G4w֙}>Ģ%=e$uwLr)V+fuyw95f7ˊ'Lr,fa5 YXjV,fa5 YXjV,fa5 YXjV,fa5 YXjV,fa5 YXjV,fa5 YXGkNB7d`K9C^2YXMYx {g: N uH@R': N uH@R': N uH@R': N uH@R': N uH@R': N uH@_'K1o $v@h3N ؘweu ߓ: N uH@R': N uH@R': N uH@R': N uH@R': N uH@R': N uH@1r0sqn6qb :x'P:F'P`: N uH@R': N uH@R': N uH@R': N uH@R': N uH@R': N uH@z/O'z&KM媯׏ۋ \w_\?io.`A2~K%f oƸD[d5.q铩A!`HЕ 9qM&>t9(]!]e ѕNv3t%pmkӕJW_8!t >y럹|aU~]eN!ٸ!`vv3t%pnmLNWR(JV-+6CW7ЕNWN:ltutEޒfJz ] p͠tut Mݻ6CWx ] ZGNWV P:`]ztly3t%pi3Jв=tlU9] g_&09mֻ+AIF-E7DWvԕAAOMiCt EW7mF]m0|t%( ӕ7_6`a祫y]=5da(-V3͗k3QvqCt%c ] ܔBWӕ$tutd7DWlvԕM >JWCW[yCt.n3[֛|t%(+]!]'6[RWvA7 ȇNW *]} z=]n9BnKNW"nCtfJf]mdPPzmz3o/O_ɼ{s5'wW1\6>C ۷g W8N qr~@Kio ^ݻ]I7LB~J4{_Y@ rI7_\ϟ^W[J|ܞ|7Ϛp#^ׯ~}\brɟ˿~NNJy˥ϝ39ٛ#JOh30ԻaitrFgJx8_!lC}0^y1l6>M>S3 EVđ(i$MLwuw_UtYԃVYVQI.Y{ɱV5Q3-9˜R 3G2!ŋ%h`|0J;+hKekH]%ᨫDӽbG\H%רZi|H*,D<7IRuLTک/F]-x\]ݍ`w$Wݪ;Q+񎒇F%ݳx|u;uuW]GWi #)8y)w?Ew="mϟa]+yYo@ʋeg.><#4)I8 nR+҅z;1nJb:2O*\8lgGٸ(&3}7<ځ퓺\Ѕ6H@ip 7ӻpwϺ'#k'fJ8*ʕ/kϋ16Vhv|a&nͳHyݬVO~Us:rFBBe,YJL&0"4nsVjyuy9 @cPbGK1]ݕlȲ_ _)i䓅/g\U sMT 6]\pm@)p̊~ouKl=C_˚wtD:ӰSrolήV.*0+TB[ #OhDY \+-i-uFh$hIDtc"{&CDX{))@#%b"XZY G։v䤵O2$>75h%{<U Yu-82{Uoգ<ļbЋ L*&sFe1Ru:O_b&e;G!\bmp!Kc6.h>*.ʆ/ nnRV+s;l[L.ep۬$?59][F7kr_*b?Wso^u HTVz d Q(Jq@XaVCx%.Wd.'Gپ~")?|l €<B'WFhF3ϻW8 
eqozRe)thILVj{S+&{L&%(0Ldo| AKцb:yQzk1o#}\70e/|hz0y:'i.^(IϘ X2;f_o4bh (ˬ1c2t0Jq輘 ^/†%ϧ_f >y\+ԴއdЛ敡q Eٚk7"y!~eSp? J9DŽRn&x92J#"( -J)Fzxm7 *-\|C~08u ٴ'paq |Uj$ϓ֐_yI ]1^v5kw/nxf6ۛnkl>&0 \^LтL/2,sIkwV4<03mCcm99 i"piRܘہ5[ ~桿ԁ~]}j7EܖlW5qE2 RL!:LD-Ei4 3 `vtܶo?ܑXΑ{#kwQ㍌|>!֜,!4hdgW=χYbwTVa>,KkEruq_Bp"F2)6F]? Bޛ}  ? \iAxו/x}^|_snfm}$hsFvjٺ xvCiH;E>{#OGࣔCjI"y1lzJ8bh-1 >xz%U hsܸ=.uUV"oY-[>ni4!zH9EvJ9sJ dK!Pz!- s;:V u~Ƶ|nat`~F)\ݸ Z+i R)`VCa@MC4LR1\tƽRaDcTdYFZJ-͈-}hFi=#c"X.1C{4UVr%qLc1k5I+PR2$PҠmPV#Ρ$b,-g!)fיZ#C& i#0ƼDe`SLoyoڣ)ꐰFC'"ӥۛNp4lx1eCۼˑ "jp^ j#0N+% DEgtۚfԳ/O;9;rC 1[%0 4@j.̋a:j!iU JB#DHD"# FT8G"W BJ(2,gUr Qw71*\~ YF;AEȣ73LhR:"A JHN/z =cw*f0iIA A(RXbpH1&[gi.}ƒVkPMFva>3bvL$! ,eR 4 <% [6J; FҦ'ትBipsԱ%|mlA6fښ!k="J0FQJ5 8f(t'mED8d-Xro-閅OυoCX; '7pW('4[+<g1f29;hO<`,sŧ}8Qɰ:3:|sCpKVݒoY% LYێbۥ)H(.dG&TGxpRhcK ()LϘT*&G`PZP+z2 Mj>D&%)XH"Jd bJi lo (WKM)0q")5(K)µn=F"vt{j^zz_߆~aYKL)`䍴T&bbut!)_@o-Td}FHxhZ亂f,X^G*")9չ> 3âҌ;$8  AyqwNJq߅|N^z-}ڹն&ʩ ѦNf+ן[Fy9ZqTp$t/?2M[CB V# +b쬒V(b:a:.Bd-];4V Rtg[k$o\7g+_F\s}*_gÑZ$oKoȻ0SB MqMI ·U ޛٯgC.+U*X*y J^%"wřL2X3S`)" 'MYs;/'aſJ! H ˥CRr%LНO|]Jg fSD!Z'&2d.rq4QˡS0! (5;@i~wov{ 3*\Յ`Im j-Ĝ<z5@ZtRy20rrG!i)󰩧nHs77vSgSDQ0i%/ѧD>McWnu1Ȧ^*1t? 
Is`KMOK_"G{kz Шm0]/<={y/gzًg_`&ա&(~0 >@ET-VQ'jX8r,ni*FmnXTU {D.;[ZVvWc멱EH=-QXj_̾f)nqŻ7S,`3)YY$XLjAuUo:c|vtݛ ZGK!2f4j|sWL#,8Z۩u㾀37ۤgM>>~,h 3]֮ˁi<y i)o/Ru<`H3ޕ:b2>UTV60WNawLS-tϿ`J>>&rD{1QX} Z-t2Wq&O.q݇ AEA~j,ZZ \%k~a?/?MQ]ԧl~9Y窽 )xa}Β9K,YwmI_!%u6 wkSBRW=Kq$ΐ`^g6GY,̭L`˕՞:KE GD1k"A B {Ho!}u˯΃2 sP]':#>[Ti>g4S'N )YwE.i^q `\s'owIwo͙p_O면|܁bEjb@YiW>0mHC#̞|/&nٷm"y5^kQEm3o$WC;  w'j~>Q)qLKql dH7Snmg|.M/G׽M8كćh|ݘOt`eПLwQZ oG3.TsLP;To^5-5.չߍz,CgmU#)]܊(KqksW2;gq6Lv"Pծj'cL;=RF&lR0.3gf4pi|ve8R}cɁ<|j aB;P(A{H^[ Ie_NcW!4<03mTjZx)fV0br ى<6<6<6I:]`3/1eRi"4ha1Bslc C!fH ^ wEZ,} *p[clc{]ikH|1̳y5NImu+]c_K3E AmrueLo*K1oЬ:%&޳vj׼g[+>mr|Xp{UHHj*2ȴZʬDӬbEgQ*]?nVwE\qp>3ӫ"GEŷMb׹SO伎goZO2öGOAkKut5s?2Xɍ"):# -'Ѥ[oVj󦜒Jx@I2k"Dđ` T?Ґ0q3LZLPuܧ, ͨ?N:a~TiVEV賊G1-H{IEn=|2<8"ŔӎN%خ1E "\JlBnx$0Qc(,<a=F5\1r')C3;=>tJ.*IJZuj>y#-(X@<-2$e<*0;`64 ##&h4H/:|d t͡{HkHitEj* J3K`/%c|sp˥|'ms~X'!7G[ &SR$lbXA02SDNob wj&pm)80Y(_\tb !B!}w1( .g}(󊏟}d%Ϧ7+E3"@;:&No^ 8~ k,!i ]RNo|wl1T-rb]˫T/xQ6v*ywes ChL gJfTOCy4 ч(Q'hE %'9c/Wt8+AZdլU*1[IJIsP. };{L>ŽQQdA9 v?,\ _w߽>?~?\`.Ňoa80V@$ @ݏarjXjg _we׸\)ןTiffaREi?6_ܮjI͆FД *y)S p>w! 0@վpވ~yS1ƣ1*xZa-Cau̖T6PҊ| rH%} `~Qu0, *0wHC̮Nj\+?vP1dvTG(VLᨨ (WkmEFócPݑf};${(!P(+,|82ɴ2( B"$HPCI쨾S7FӉTs[D3?է|'/lE͊;^)K*0A@ 4"oۿ}If9 U9-25cK2o+L+ww{7n+'4 ) `)@^h !f?DoG>09ѺzIE+=4fKӨCLFPvMͲ83-c1KI\6x LqY̘C:,N,_# /osQ%B6~ +s*5ng?4UIl V~L`0]ik 3=V7҅e]6WͱBUۖ{XP@PjRCZERo# .yp .ҥޒ*,xI{AZC[|R& 磬}^έr[:?^Cp! 
x$ )[X1c2bDk45[!-M':]pEδv = IMx,@_vGEgO> trroD5ǕX0>Wq=mM}4o^m!g϶T+ٚP2zEiW`rK%bdF/cL+l^p5([@l3I(*nҴrE_Be[~\vk>EW|k- S-0 nkNf#jy9^Wb5=O!= oV`h0,:igSl/4-؝xg,D<:=R%(lA8">SZi5<rQ٪lsyGҢʚZT+ ar)Oˮu_ɲP(,{] yNxwa]6LwmT`6;@/ũ@j[/DJ:e2b';WPSɲ$ 1Vzd<=j+?P u(.NuH;S.l^$Ѻ.`[ː~ uoҗU/E":~)cB>KYq_fiܧ:+@O嵼*zs멯0=G)N ѾW!/ D0s>u%g!\sFU-׀q/)p{7vi2a={}wmqtc_&#zwt>8_}#NN$P+TCWk+ d./ Y3bs#b QiSRr%qLc#۳#iJJpp^R:Ze!K˙eFHg8g&Z }ι{7ZOۺwj&D 氧\2eW )7lR]>f~c Nl!*'\ѹr9 "jC qB uƝq+% DEVr;Ij(K-7#[-J $LJ%בyrV0L#HG-d ҙAPI(a@## FT8G"W BJ(2vLg%t Q!*\~ ZjCQ"M  [ep4` G KYYO] F݂OHiIN A(RXbpH1&[gi.}ƒUT-7>%ǽt^~h]N}7zW,֎D9!E  V*BrAgǠasw1 ݑlq\xGn`qeN䃩xwŘ~;Jޭ֤i{1V)c2JFsU`.4 S[w8|Oed+9tħׂ< xGwAD Dyb 0XoYnL0LD<#2V= Cz0'|3T[ܒ[ޘJ8vc$,-jtZv6[KDO[%$h3Ve](Mtu.lW AucǧJfkL 0['>L.kNiI5eI%>|)I{̧5UW[bV˖˯Xxc8>p߀6| ?^N?LzsNe7ĞD@erciܯ' HN@3GWuY4{I%Q{IYT Jpįdq_Q=G QlKWZfuYiil~ h04\`D$4ߪoـV©nDpGll؁b[ԏJ%51\8B ഏYN^Xm<2ofRm 4.%T8 yhKI 3R0MDK9 QƑgSxd J͓Io$<4~:`L>'j[OB-ٵypc[,Swޕ"#$3Bq-dVF)0R<*=Xa'Nwaռڝ.4[v8zfБѲb C~gHvnx!)i@CcA(,ǜ r$(AơbKkTNj?2cB)7j<[֑aYqGkZ )$÷H`?xMLj9+j$~ooSSZm6:{u|܄l )m>!Ki0fp_A&Pn_dn7:85TYS|qSs%YkIk9]y LhK^񚕨C[mUk!{ULdyk]^-ڦ+^äjmdʙvV^% I[+V:O{BYs70?tY`җ~o"D|2ܭCnz6]J;60$5AZX,_vGEwo>RX--)J1,,6).Q=kAw7FxĘvFkBvZB%CSŸěU`̙@K=u%hІ2͹[~EH ǟeZbP'ZK(I| yIEt$LEt!"'U<` @/e_E.qVq@Ma rY D#ȁf1]PKOc=  #$$5pFBb%#1)DXҠu0I̥.hiSZOi=Ҟ5(:l,\Ϭ1U}d<}l_N.?)/ {9ħ7>.TO_xU| ~U[3}7|}Ow`N~@qJ$w\ۯ:ǝl0(N&]2p ve+˜ʠ.BAT~` (tqzR._/\OYTۯ _-T0jEUW*!ѴdWAoȼЗ r"l̾QS&˓lz?em*Yz\3/1#TՐwPQ% TD#! Ã#QL)88Yf3u+¥&LIViǬQFYJy>#kY#:N$%?yqnդԆײErwH He"&V O- I%Gdk 냌5@;hT| ӔPɥ#+RE Rr"D% 3âҌ;$(" (O OIn_z, wфzP"M"O_lpBIlS[h/{ (O1gX2 ς/#ccVhAXAW#@+b쬒V(b:t\r0Zw i'n ]õ"Syfl]*?cqI߻jR80|0HHa?$'.{ktS"q/W3YyR])\]vx I^wO\Zz>8Q| tˢpr ~7o~M|믾yw~7~ROY1P}:CG[Kְ_5UJu o[ r2@dJgjL]S*.MA^`g  YU -| 澔{p`! 
t&7)~Ţ^9R6Vi]x:[RqC +E3%*e(`~sd *0wHͮNjFgLo'1?Q1dvTG(VLᨨ (WkmEFõި{tH 'P Q4VX@q41'@-L ($BG` #A]} 4wGo1N9FPKb+DKʷ$T_h(j z.Θ'7nx?{WHQ/{Y}0y؇ٝv0xVKܒ`fJJJ)-QMUeQL2"ϦgW2͇`1}d`~\+733v}Όn|,B-d㽞<10Ž-;4Exɏa^}ȔV0#(l6޴JfM0&@؟2[WĞhs {`;8b6ǚE1+<􎉅q?_XI4 j*en ft ~6^zSA Z*ĩ; ;24hSKI1 )8I JvC+-<ɻVÇDEs7HoN7 ׺x}ЇF0UXֆyZڀtN [n1Ǫq{mHEҒ)E -&03`^ ~Jbt liaWbeψ62+(-RNڎa q@eu{od\>guTp`JAEŴ`RS- eV"i KK@m9FIEZ*px{}.`YgyF,B0#Bs`"RSFDD b FрG!eLD:[R 4t}m}˿iu6Vw2_柚1-+:`>f?2:y'ht njM̨3hR :}rFN&Wj}opM]ɻ)}lrm *PsA{zޥ޻w]nPBtRgҊ[Ki. SX`T,(_P.}4Uك`pduLLC4W`dr*H]S7^-aR57 N1{pms#bH+Pݰ1,o U0)PҠ-hE I<K˙eFH8H3tazI%$ /EԝcUk@ A p6H"x,8HpPC1q#J+so)Mr+jVIp[-7`ڼh"pdAƧXK#"x`F8d3`FX6!hQ\1 A*!lhYβMg]9 Ǘ^#Y*D [eI*xuD *RgfErώd #!jPLbLj54HRx8;r%?.~yˆX;&X@!qٵR$hjy J)]DW(ZCikG +qT[d2JFsU``/4 ]-_Dk=" ,|FNCK w{Wwe +a[+<g1f29Ƌx1% ґC4t^2̨: !բ+ oHH(UgJRH7]xWaj*gRfτ8zdc֒y+*=l'/V9L]X=~ZӮ|[9Z*/&gQ}Rq]*w?ُ1Z8*&E 9Q%yQIWF ?AZp~ g/ɪHV%@*U]@ P^0@y{'{LGUXdN9pZ N﵌FMFS)+[^?O8Y=A}B]u`2[lfO?@pn1hgJۣv$Nz8iE\$Nm z4)tY4.+]]5v]ٝou5k02U˭n[gWwoT(wFn[|RkCxܯaweh`ۚ[%k.^1t=f0z:(ߣ^P$H4_FtryLfW)XΥJiRK{^W),-Ƙ( 4UhŬ* 4J@TTS,K߉^RKJ߉w(}'J߉w(}'J߉┝)SNJa´Kt4NۅFYJUJ`i֜mCv1"Έ ,ծb֢J 4(U"F%b3q?v-"TZ 9 rUIӠd4iܨM>~L՚[\_/uOfluJۇ׫[I[߯E@.˾1mO9ֳuK9xƑ>E uyqK.+rS"Kp3"اj KII$5)XZYh6䐛tYm%]3ܧfjh3JՂktQ\^\w Ysͳ.#Wg"'U`3^~z0+DZ]zёl'/>LmdMS%ixt8N?!w sJeaf<ޯ#z::3֑t[xӄځGRYAb0`&-Ŕ(2kؤE^KS bk*u4Iڕn?xdʕYQrp<8}ry¯E'<ɖj xҗ53U<$..g*tJnKM&㦽VJoJ'|L][^/8̟ԿE^{l6W(7ĬdjA(oWo x5=vP*::e/.g wӆ?wxZZ}{"daݍ%իuBX:Oy7bEwBctf=¡}{Bd-cSQqئ] $$(5+ةu׆#lΦ\>Xg^uF4H@8,I_`O-*U ݗEVD=YR ~;sþ%~ջrJxsit-pD;1's*;q,ah^ yɾpFN|4G8hZWꄻѩȻ]W:ks^ŎyN~h ۘg\+"tv8P>%5<@yFQ+˵cK-xi{`؟ 1M|/aR{(^{Υ/Ox.7OqY'=j6ڠe<#j/<ت="KF=X!)^R &C"iJJpp^Ԫ&ZA۠FH"Ici9I11~NL:tCe'/pgrZt.P /;'/fS&z*8SP|nl~|#O&̈́ҡ$:& "& p@:'PG ^kᄱR@DP*k)Mr`]9rC 1h"pdA'= &5\GEY0(Be Y̠h$yVLZ"$"MTR#*#+TT!H% M3Y Z~1[la8, zP&t [ep4` G JYY"3S 0KZmRBVRImVFEKpvS+Kb\oݦ8,֎$9!E8<mAT(B-AI"-NݬKt%+l}k6$<79ԙ|sԱY뀦A&n x1mMa吵`X="J0F* k4^RiKEED`n_N]b}Og> ,He +a[+<g1f29Ƌx1VKltpW:Zzo,eVgF-sB0,qΖ̬s΍7.ś 2HH(۰% %^lB<{×||Q~ܷ )'AKW&NULQ\ipe"St?!H>EdئN'k;˟w x$ )7.o30{-#cӡX,klPb> &6O| 
MW]ri&гA= 0'o ބɸ{d[oqԩB+"8SA>mS&FP0 G0ˉ 'VZӌ[꼍"AYV)}v>A'N*Ť{}QITkFS[f3'ۿaW1H"%RgZX)₀U$Q(Jq42V@-›Ex[5.>ԮRO*[w9ie[ъ-moN4ސt?EA(ТNJ *$(Aơx152UO5x wH9DŽRn&x92J#"( -NP.k%)$7Ǎž{GSxK,~VE8N~9/,s?n̍bTɨT vy~5ޮ288boJcF%$+`)$Ť؛W5'E%34vyRm-fF;#^8c$HDzic8o-`k U cJ#68R$#7g)Oc.*ul[6%EnIjX 9bͯeoC>H 7 Oq[]7{8L֑&XԬޑq\B_^iCQrZM\#NPh )HG%].E\w-љQ9޼w^d"ID!' _dsE郣ˣˣ䵙G}Ɠ&L}o@ށ\LJIͫf754fI,f&Q Ê*}wg:K;1ttZVG/<3 )T?XжC޵߭c KhRж PlXHuoG/ GQw~hy7_ F jofhZ S8{wE O9l'˩{˓?oSDo\r*5J-_q]^e1t#+D&1beP٩ǩS}<8JX$witq\(wp)B5>.j|sWL#,8MŹisS@rd?^V1RHG/è{b݇qQ\>vhqk40Ns_c)=l>{:7)X?Y2^idK6$ M*[<V4|L3(8R'*nZeݗii؃-Ihc7aT~*GJ@ ߇Ot\>sKҞlQ5n>hhu(谁ηV{,Ы CۃJntC2^K+F= f~jn64 wh]\v}_1*7YS܋˕~3:{;#Է|yH@qX؁=w6rH&]4WR}:T[E_O^ݚAk}2$Ed(dHgV`.~HTE@,{>d+2g߉/\O9RemAE}peĦQ+ds?йye&YJ)6:ABC=`th}A\'4D_JXE:j z)Kd}O.xaz2OpnhBx_bIMOlᐑAX0e$< k$dhw^R7p;ϽNם@P ݣ o< 2ɵ?x}7__ۿ}}DHqQy{ELt7NG*@J)(Ԓ2ha(YN^X߆zN? Ha\TTXAI!8b(jNES$xSyon_PJ"u{o`4Lǁ! רEo>RQB9Uaa~낎[A/l8vu%Ϊ }ʅ].ڶn Oi)Ei繫xM(֢tP,S\-ӫL9A Ɯ\ })!s׎ܵQZlm)[7V&w;E_(p88LgY,mhrXA}btvLY?xť/]'wk/'_o[-8dJP=8в4owiۇHJbH8[/NZrG3첉sH De"&V C\fԿLSac?2bHCn90I`ZΧ)fM6`AHnHit`ɜRafX;$8"` (O [9YLX¢gpf]a4@#”3 Eo) kS _ds] B FWY%P!t aԪw tSH2R[ς; *o`#yu6 .T.?ס"ͥ\Y1OB *.ָ kק)3-]siȥiW{>=$4=Ś3BSP; F-E_HArf: Wvxs<:C].ښ+*P&\h.*LWU7;fF牊¨b,~^B绱ICS*H!צ$ (d_6-$dQI(edpzvz lߴ4.OvVKAW\$)UGŝ5-.Y~[^W ggنPr2W(5lJHm|Vz?4%pa!y$:GjCy0чѸQŤǣM혃1:*A,mԶJVxz6C#i`ĥS_Ha'z9IzOh⧺n.O?~۳WM?s7?>瘨󳿜 h?fUtm =C{LvCGk0R]zq7ʱҟ~3SrnΚi6j/NIbs |vִڜlF}FWh_ǥLQ`C<h 6Wc09c#ݵ&BHeYْ9Uc,JTQuat/_@ )6ƾAF=iٕIMn O2,fNS#ǐ Q E0Z12D^qy^ m٩ёF#$H}4@) NY&A1QHq°g "ΛPlBMl~'ꭇF:;cXD2ڄ[uV[QEt+G:rޚs{̩OjɄX|)˻׳Gfp_vu%7iQQKf-D~Ze:ߠ!l`JhQ bDI)1vpo}p#KKk S^!|=b#N#1[Àob`ȹ [7(K9lVJHH:6:vh] =3Q<δh]ҽ _ܙV.x?\[Kѳ+g̦>"SsyN5ϩ9j `*7Xx2nIo󡯤 p/>^Q1:*r\w W]rOQK8'}P+}eA5f]'.m`_ǩxq4S{yjXK^ դŮ[2v{*SĦ\/ty^dI!' e®QQ}h*t6xr߄9}oށ8'a ͫf+umfV. 3jfaiВfg39:- TVҒxKv\NȞh+{罗AwSEM **OV++\X ix/ZQ?\#=f7޾Wf,d&#QHYu2LwƦ",`ZFL&Z i袶vt((\XQr0H0yDo`5i"RnI'at8¡*Ű[)rov6,ڪ.\fӔGM`Z~}[Jw[~fi@s5aaaFҀ5R$+Ds>H*)E2! 
堅 B0 Dz i i iDE#X}|.ts*Fy'L#\y5;x"Niũ+K_VWқaoܰDzvE?~MEv˼ph2J\؁{b7?& cM~fJ띕 JᔻwۅF̢KoGz~2@/$elTwmDSVj8e0 ¸?"I-2lUNQUBµ'˭qVᘓұMbo]1_{]^HC/7{ \PqS۴f1|wY@@<xwWCT 0 4* Ҕ|hU t)Bx*b]̄Sk]Doo]|Hļs6QX> ڨ"ٔ阸Ʒ: #+륄?eo Gm#NG+9ӚRd: F-Q;{yj $ÝL%PI6:HWGNެѪ7ANVyBxԄ QQ*-O4$Vj 6 "ک4d5wZD/mm4ΫF*t;,o}dɖ2AF#͉22o2+=1,lZZ}y˨ePr:(5K!*WG$!xK\gK Op+g ԡTwW̅A{~Wӎ;S!N'ՙ$}Rg&,LI:/ p"pAhUPk-WYr ՐKՌ ~un8Np^ RPO%N$bT ,RjU&xLdVī+Kn ZMynIZ[*Tـy.leEqܰF !܁E~U[ ^:j&iQK20IQm<vбlALa n9nY$D^((lr`A#0 !'jc4`ʈ`~i!jEP3WXUty"XʴVz@Ɛ#a7 $0Y,qs3cz9*d>.Wˬ=B.osKMptwe4>=.pc݇[WG(ۖwwO>ﷲ+#heŷ={gUJ<ѐ@#t.Cunǐc^oY}g?tO5DJ ks 8|I!qPRjz'kh|wvvQ$u'c}{x`YŃ{Q" 5 ) f|L.2Vq<9#3@sj yde$$͸u~(͂[?DLgcVK2__`xgU u%$+3f&pcZs.ARQsy[w_%;kq:>6S攝)SV jQ]Ӣ[M_A-\{"ElQ-VW?C]EMeRj*RSdr;1j'FjM]kWK7xۍ ˠh0@S~ta@b>,4OGE8^Gߛ-E׫[v.+b ~C|$7Ÿ'j7 |v7>#Fv9ss̊.ki9K0 Y#id-Qk%BIRfQLBA쀹ku|T pGLlpxp~%6eǿ2Aff逹Y7nՇAqw00`/7L֫)tPO2ispm W0=xYRr֎{^b5m|Yr[f"^GWNn<.6("|E_.r7,^I{$.'1q?g,"68pL]LS*"D vs|64>l7֩/kܳd<$n.Itc:Zgeف4"մFKoLXgJJ*O@I [(!-Wj%?+$| ZEs#W\( >XD} gRTĜd: m,Q;r9e%U#cM!Q<^ɖ#E!jK"qg&}V6s5VWƱ#U;;P><):QU9K_ QrxQq U;gos6y!oZvp{^#7'o2Vop{ *GIԢ@|xЌ$ŕey17L saLVA:#URH @S@HbҎ1cApC,Y# )B Kf)}9uʐ 5Ƅ==JSt7+sj戨= : [,[N'zr].n|H><Ŵ)~A4Q`)kU,2u)4B{%"%VI VҔk.H]QWE\¿EZ 箮7uՕQzA,/]qur]])WLlSWoH]r|_Ayv7qGߖ}pCa| 5;9z퍆ictU컫!z0YV>*MɷhyxwX' >  r_ Bt2=uᡳlc#=@ -n엇7&8DkΕ MCэ8SZc~2+*?_ @LpVb&% 8>ɲ'=٣eI 2@-[|do}45 hP`#x4E\":KEJ:/E0f uU6bUURW(eg+DM]ueXE`}9E\#/E]Y&:)mSWF]=xOQǶ{' lk鋮#iޤ&幽OPW>z$uUVbUWå"O2WWEJkz Uµ](z$uewt~[I ;Ƴ@YpA>&>ze2wTJx ǔ4!K]צl߾F"yH|~ӻ`LkӻA[Hiyq Osm;tKz^!$$ߒ[|KoI-I%ɷ$7$/$J^VmU[VUoU[Ǝd,** ߪ* ߪ* +o8t%ɹ󤀍C}7RкjL!( kF_19F\Ha 0e[_g7D7ZCY5d?j:~3oߜJf@83|4Yl ^_MiYEet1{OK4e8")1[ϣK N1 .#B Z_CמZJTm 'q/0NOE?@`0'ӦfqH"%,EN\AW5Qg&*iЦXsyhV!,K]F2@RƔA!"!1Ѡx&(Beyl@؃CW1kʻ]oW~i5m4x&E"þɒ+wYeٔd;3,RofE @)8Iiʣ,eA#@ ,;ME~ i vHI;i ;P <>B VI8u{tW! 
{` G ]`kzBϐ` RxeT*ej{ųLCųϳmtu0?gX`QBhc .ִX.HGϤ:H~i0wU`+ksNx=c2OgX=nƀN[5WID){r΢uxF}XIȅƅwϹI8Nђ(:茜=àV|ޛDL8Z$\˩dsy>E6=!},OAr$Gr2o9 űhd2(v6^X\`%r!$x?(;jrd%"J) aJ6T82X)(P@q%@ ;=U+Bƭ*MVh7/ɤW*ɱuL2.'̴ R22ԙȩ* jGavsRM{%"ԂʊoW:c:!e-_,fJA jFFoy!+9=>"$2BrtyyחȎ_!/)3_~ΟqP1I6gm6}4M̖T1Ki$Hjм6 nT`}7Ix8 $bf7ˠ-Y?weUR`h$A6p;̐&5$fHi6BHl{uHGI} "p +2- FRF҄hLP0Q<$:BurU۪'9FȞnF./!TI{VE5Nts#p=Nx(ze&.d*/;Z:*bB8 <+H ;:zW+qsjG 攈A$HNh"H`ДL"Eu/jsSOG>Rk}P 4 z:10 EhH&MTBb /-9Ey444d(}Vaխ@Nz 0J ,mPSR$*ɭ5sۼ8,\,%Fe<.a4e-sheE uBr66.rw7m~n50}s2!ǽOc Ԭlծ}0.LqrZ zqzi+^s-Yem`~zGzF[uq.PS-ObcI)@3VQ,ד.xڨ% #VNbaE09 ʪޗ:Ș;z렺u P~hq<r]&Tkj^#tTǁThN ,Ѳ.6pMnUXSB8gշNms (k@YZn]+w2ZW:bڰS Vd*erJ_ȾWTZ5T +qN>8-i\?d)PPIaq8!zP-Ry[~(|a,Nw9 b2wm%f>_B#tĽhDG?*Mu%x>&Lj%傩DU p&q.YӍu}KepsЎ\Mfڢ= %vT<*ӿn}lAnz)7TUQhH$Ҙaڨ5gE3%C)җK&' ~Mﴴī9bD.+ õDf;"Zlyǖ~,v d9sò$EΏ,XPPlpDS K﫽P;Y"I*k< ?Ȏm{\Cv>r.a{Ln@-68\߲Oko\4E>֏PTwMc޿pO0|uf}I{}7EZk7{R o=b(lbسz-5긞ɼ-TZS֙.ɀkth_oW<˛pۑW .LxI5($.J)b܈਒&v۩`ϑ/0!$T,3xVd.%{VaS#o^7 mX 9Cat *,X&4^X*S-+^ZS!d\3ϿZйz#i5QcU4Ol.|d{Uz0V#OFN!y/a)TtۋC{Z !*+x$D49X[B,c!ԡ)ͩu݋+rv̼ .q=渿knL'`~+UقVR|R(I A%(,"'`$昐!ZhS`7+7Hng%T%xs*t܂! 7P9o"Z$hW* S9s$9g ҄%,3%\4dQ'3r\c/kQs IO*} >LP8M8p@&r}&(r٧uI\+I@:k)F:&"jub't\CurV SiaK8#CWCUrd=H{Wɑ%晴t0T"Ym{8"5#1♓HxujdBxC-'OM|,NH]B[U>5vFΞܖ{sz:hm-3v'.c歵dܻǀ;9UBNX#k5cZ ?zQيIΜ "̿.ݩ7Lyd࿐2۞dV滋ٺ2:wۖw7w[ތvv~f[kí~M~]%Ww j6]K&wUMaC ^rP3NDҨJ8%=8 $h]n$'!`@Q>pi)PHB #T5s *Vh-pƮ[yrsV__ݠ}R}t#SD\U1T2;P{/,W.$I0CkarS"@3+L۰M\EvLkS@,´:iCX!5RYPKcPȸ0eϏHwHݽ@d K\U I |o@JA \Hf /oaRMC©A1) h sJ)1Z!J;T[AzVHW^rtFvl_ߧ<.{g\shCGf `I̻!ߔv}}0ϟ\+)Ŋ6DDmUj3j)(krvG(+?=+ ^QRp!?+|;`{G5doMrA&rYr${^߯jɒF-k,ʒF0""UZOHt6#P([ qA^g.ÛL>4! z}ܻ"iV W쯦Zzuvsu}QŤ$_ ߪ9Ԅ$[hBJ՜Ҝֽi͈uM|w{s|f!#\Oa/.k.l!  \|qqY? bHWmðahfYޡ 8: W0bl'wc7&uTF6:dۨmsUdz2LF]NP^?/io4=JaI7t*'9?>%_~oC2}oo߽_pnGNw&ᷝ_ o~~hfCs [ uY7 :66wW;}?q5#sGO >o֧0zw'aWlk'/m5*"ZVrw BB4b}iw_ }e4m|U̖T5J$sEiV}y1 ~v0o C:$xbgך \4] )S5 qxnhN3AӤ$MB 'MPΩ{ݑ&<¼#OD"2DNR^",S`dTQ heJt>!Q%x*O$CN>ʤs [&ފro\5<ϋy> g(^1G@m@k^?_S&_w Fϻl e_QY];q}ӽQ; ٍ4F0ޠ\ԔIv=JEZ.3{u`΋Wmbx:?&mD-袭5!>['M帡Ss1]u ⳹m!ruCۯkݛvy? 
0{Q1~vTvjM ==%,x#+.uY*> Y8 {P<1U(Cs"XP0J% TY*B`rV4AG= `3xQ_{5%!JʘB$pX⩣FksN?Ыtb˭j#&<)mxS 6Lp/JU\@>ڪF j 9~TMbH.{yQ A KH!UrY*1uhh4o2C.U t)-{%(at\`R3BDN H9o!ZDpuY;.@)E/|>Q-7]=lrǫI!<(l6raNV.LP\99G7 B\8l :%*I% /a៊)kKZ*qswK[JOnY_`yڠ?t4  wqKpb&؞hS}D02WXq;=Y'$aIHeӂl42a) x"^;.$GL- N:{B2f4%D*@ENRQ5D%U-N:I$mv(X$6Oצ3.@F* { }{1%<ҁ voO9Vs*FVƆT1bM3\gEá}駦:ͺ;8}Wʖlԁ :L,QnXRE/3@`)#Qsb[!gIp'ggT8a9c9 gIdN$mRAH-0 A{'HYQ9$Yg|[Rp=KBIǨ.rb /,gRΎFۭ.FRUGT6 h# NV踎!$dJ;b"*判J+e7)$+I 5|81.mGA-38&tXvR5dG!@,W 5 w"El𞨣<])84W1x"!"L;=q+\82ιttcK10'i:i+#墎q=9+">SX̬F *< Gm>0+\0蝰6ZL6r1cGjyKoϜYG>5uc\;"76յ@hs׹z>e^[Wغn]M[os8y77f!,3nZޯzk|Md=/hC=9o[F]Ϧ{:Ft5VG(0 N5gӟ5ֹ]-vWѧƎzVMZ>Trf?&teSn`E NZY'y4IHR ( Vy1^\$*,J7T$52J1hښ"uIg--U1%(Tɓ҇ȹ_' }Omg'+_s\SL]v,YJ.dnaOtYɿ&ɼ:i%8G*qa4#/xԺ#ɏΏܥnh< P?TEV+8>_jn\כ3/~r|\{s_gw-M y7v]n>'j}j6=Z1f>Z1(0 NnRnĥTL$АwfgE|Q娀լ`䤓wF2;pYw"WgN(Y(6: S KokJlJ#Ljiw8 ޞBֶ24ZbzrǤV;M-w`z)4tg)T=Ӕaa&ȅ{&1G\Xn]ўg:cayIjE.,V7 ۄ턷1GLl9q`?oyCk7NY`3%IԿ8lx g6PN EM o{@.;(e`vьO  ԉ*⩳D YH5xK! ߛ909&1%-ʺek)^WucBjϩzo緷W?vURWB\ܔ(Ε U\菈Em+'AQqʺ] /}PR2/*+ƥ \5B1HCJLP҆Fh4moL8+ n8" 6l/G͗CMgSB<$Y,8B Qt@8K#^|xԉ|tL>C<^ΕJ:صw_bɤRD`ZTDHbQVVk)"m `uD"8OAv58Zz?zF$Svx^8㡝1o?wQGd6R.H /ڮ 2&x«zm&L"ET֢Ͽ~xƭzuOn0 9g>WqlFsҚ<|J9g9iF]UqɝjwJD~@u唓H]U*sQWUZ{}UJGz7ꊶ|t)47]_VW]J\+qEqu:) OK]+ujW/I>Tי6WX'_&.~Ͽ?>];luM`.lDԾ!BT]TjㅳN"JᒵIJ*!ʔleAD!?}:@N}źG: >WEWVi ?N4Ȅf0c>n7 [_y1W7۔{:~۔za6AZT>Bvʁ _ \XK; sH1E!=КBU O((9W85P9MJ, !='7WRXII )21&ZBG @אLdP~C :*}@`@0`C-K1z@ik˜MbMr!wdkZ3F|)Lh*XgcfLSBGs=PDw> xNۆ' BŶs[Gt{ t H^'ݔZ⺳giաϬRmB#C~4Dᾍ`8b2Bg5Z*; :}؈Kk Ybޟٷb),X Iڙ!J$ioD.E74ɖ&(WpJS@VӤEN9^T'QoXSQՌ9evS+NA%d'ALsJ2"L?J RZa%* DrR8I L"cҾ%h '0 hO-& CO& ;חےWWAockʬ*XOz~;^Ɗ/+zks׳y8hsJجSȺ@ IldQzTLNG1Agl;䶆fq!ٕTAyRD *8U&3On3VFBIdc$*μ( h'$O,-5sC!p`1N$l9u(Öq֚8[lf2k&j-XZ`H1$VY`36T\dpAM|y/GEj_s՚|ݤЀ: KdwNSR"NTNvtT5G4eg'v ]$]ADZPTxeHSĔ5 U5 AnwqRu[;UɶY]w^x3f%AFVSBX!)l&cEH)fʘ * <9A'lٳfEDR<ڃǜ5T J%~MúFn{/wMqf1T3 ϏKW:DD9$t&Gr6}gXm$DېE8g \ەTh&T 35 XfUr*lSa-hN9kr9kU؆J3'YO|M,)Kln]O Z֎V'=+XR'?6' YwpVL[E߆ii#kY솮]F>WrR9# Va?1]n|%B뻮%boeq[7zu3i{pC67׎Cnpf:[v>Χ[07{|m|W3#&1/O咊{okԜ=ⓣ,y:?~/jj?:H9 
}SWKק)~ovOI34Fni8aZAagvv;'a@Ҋ:^YD)hBQB畊5_0)Xⲹ ?/H`H )8$, MBGa/µl&Ζ < ;%}7v^_B<6l;6-yt3iǿ% Ce#zR`M3hZlX&ڢrҲ*S8*tvR$g$oSZ6Pl"KcTT%[g3_Y|/[_r|٦β7xf.M#^Y F7R%u(Hu0|wx^JU\욢j6}oÊi!yY&N9'Msվ!Iv 'Mč4TPǔ"L|Yp%cC 2J4A*)"OGh7u j ś-\P׵{C[lvY-Q~}YO6|Q6P4Ÿ(u=j;2Qi.PM7A"A ]>ܟd)g/R)Ϋ`ZFwR9hS))hiNЪyGg %tBJ5^o>zkg>oDO5#rc鍋cgɘs@-6_ m.kcfr=67Z ^S#GgA[#f)v F)gj>'<ŔQ[3U<2[K6m$Ԑ &B֤=[,U}zTlGHjwMZ?ٚۉuG:._0AfMrSj ჏ufxׇ[.;㽧Kc}Y<F$ Œ1Y5R.ecnN Bqym4 /p?o+O5{޵uB@"8h 6)Hl+E=xH\}Ēx{ٙ4p.:mqkrT\ތ\ wu,E0&1V燑Lo>;ug]Ɣ^%XW_gZ nJ _ZM˦ݟrj[W`BJos[i,V=yW>vzUz-!j-f\a/~m6Ն?&E>|}e+n-#] CڇuyfypT(Xɒf'cUcuTVLmԶJdz3B##i`إo'b/툟ᗸ7N;1 NįUga7Wo_y6ۋ˻o߾D]Xip`^@$m'ځGC͇.km0R9g]/KXs\XƕLoo?;obL )L:^Ijs{ v]]Ozm5"ZV/R]R(}4n,@1VH))-bQ=W&e؜<6` ``쏑YYT`#])ʝOc3a43P1~NS#ǐ Q E0Z12D^qy^ :wGMX1y^,>P v+e eidPDH)' p.*!Q[L̇5Ɇm#FY-L:\-]a(VJ?Kf';g!u%7iQ3XRZe ?/*gArx\HsHEÔWG 1wسı/jc(SZ9:TC{Jܲv@Ї ޠh<-[QQe+EǸ..q X0taE$Xp,8\ l*=9S-H|=LLryå @3@cĜE NV G<`T.b/_v;𥖷7%+{°7vP~C{mXRww9^}n~~> [smmZtb 8'kIP> aտ/S"2g,=9V\W/| ,|2<>~W J] L면 )P>҂xKv\erY]a=r,Ơ B(=HT'M=7[|3(ODkwR_a{ AHRVa S띱Vc&yeĀ屉hj4BZ"r[']2.k(dQr0 #<"a/R8hM"eih$1T IMRp= Q;`K)-NS{Hܙs^DܗN&w7wđONp:+*[+X4VON-?lrmⲧڤa햤ʡMV>|3%yG1582-!D`y8Aa֪eiGQ޴B}}Ӈ S\-&x#32x5sf"(ܚ195jr]3Յ4..ܫ.Q6M-in0UO/~ՠ0Ӡ?~;K &xѨN0+D{CJi]I'[l9i_&!a iU3p`\ > 'H @ !]E=rAiL {}X'c_iLA.J@Ygm\Yi3 z2Dy0:>( dP6R%³eZ7}I޶"ϖ&‚-*uCuҤ'?yQ^R0@OdU^у3few7HM!)W =~Lr Dp$U"WSQWZ%]]%*uFuť,xB JVNA]%jѫD` Օ{Uv&fwffi^BUqIGu4 b}\*MNƹ`1X1 ]sk˾;MLヿ02L^ɼ}49Pqƨìʌ>FhMosk7Yj1Be=p~ͫKE$"_h@SR'%+ߗ@ goZ:Rj f(HM0&wy !ֽov[$Ga'$?ed;O#Wê'Q1?zն ĩ)XĔUoI><Hg$ |ћvQ5 4]6\Q4H&t"/s'XDFqaeDe ^VeeybwCJ,YK1 1-7/~kMx%ݞL_"L_"WS1}@-M_3}_#c?I/Lǧ9]2(Q_0baV`m+ N/K8gԊso^%ԡhiOv<ՒyDʸ9V{VE*܈zI{@N c#YʢJM2b*P j4!K˙eFHLʢ Q%cmLJ K?- DP[W>T(KFIu9RAxD$SM<t$8Np,+% DEVNrInfw/O;9;rC I%c [%0 4@j.̋a:j!*gEC%DhV,3DF4QAJpDRQ@4e#gC9;d+?,m"փ4Go3`g"4 o)  } b0TJEu"S>~)_+ )X VI!T +Q )Ƥ6Zc22t,AlPMߎ 1"'bֲ-T(B-AIGlubQ094xb4✩ig#M̴5)G +J`8B(TF)h. 
RiKНn~|,EMl,§k@< xGwAD Dyb|kg,7&L&"x'cƎU6R:8+u_&$VgIIQ6)\ Wu8vJΎK[@"emR'}B b.l?'Oܧ !'AKW&NLQ\hpa"St?!HwTt1T[K,Qskm0`xb F'{ͽ {5%rL"'UC> ڹV#[3h욮k+y S}%\.nLBN~P@bxKO4кSj{XdM67Wf!,eSChnW-.ߠ繖a<y&TFz ^9N{l;l=;YR"wt3Df#3@'.ծ`֢BXEvqȯԏGGyuN{X6[ާw;,m- KGqb0L qb+h$ t569VV)r;{Gok %fG TL힌mAt`nʹiE/]is+~λ;zh3:(Kp3"ا"nLDI Rb%0l!7w( ?N BPF录-S$VLje>ÏmpgC":lwD2;pYw"WgN(Y(7:x S KojrdJ$6z͉dms$uy(;&-in\,߅>[XɞiJ_0«=P#50|h3m`eX^"ng{QefA=I&dnZMO杼PiX'"r}mklh_6PY+vi{tt$Pkv"X-ܥ^[ޚP n[*mߚ9f'F3h%ƣ_ ]ZIe@zY)|^N/j_?/Gs&<5D!p`:ox~!ֈ&w;YBEURʕUPWz ދ;#e$de@ ( 2fZsN8%#JR(IJ֡,8dA)~N.%%nuC AL9!\Ɗ<k,=lh+T)cm|*(gRzV֤آAz5#r;+ dkh|G03wl- !h7'#!AH01 %R$K)d% !Jm6{D.KR8%uQo \ Kv]bo D^g<}\^:#㻸*ɛ@L3!%EIU.A9 1Y 1Gy$xTqt6!u,,ﲄD(2G" m+2::k87$Iȝ$ 0 ScPĸG8&aV6-ZHe,`Ԫ X֊roZqȢQN!}a8TiUp ;T Y1 `W062NV:eP:4A[AZgggҶO?́v'a~~{q)ۗF:!>gQjB-ˆ?ț+9cO4<ߗ0L>_d{gze~=eTUk'cpř,IUܒ3PSLp%I>P dJJxGo.1 Mdb &dg[/ xZXΏ'>a?֑|׊d6K4/-o88x1=piwG%SR|?t 9M9avxG&O/T91L%H_3oߞMZg7#1pGãwJw1K"bOgE^pR1vHاu$om>ԭ,3[ĄZG Y,d<:Obpz):`:ȶQ۞p$ӓs)kbr <(|NՎkznJĿ A~xG~|{ݫo+?~{~{swoof` ~ ?_Db ߿ijho>jg~u]p\XՌ,>};NXrUsa;S8 +D6qa8-fÙ._DlR ǩV!0z@-.]Y[g}[}h~)]lNU'9+\Qߴ(e~ض0UVA~^E%K9٭:'n ֟ɻON>(ߜS-O,*EH䬲<[i0 %sA1ktxGzɣ#sMXբy#$[^!PyT^ˌrg0oBq gP,BmU=xbzXpla{RԖgY-V['6*ިll&gd7k0@sHNelob)OG y_B۴փtj(o}OG3?簡:S7l4C`r2yzѭ9T55m*R9*:@C: ʕUAWeE +գ!kd!2KNneu2;UGzXx (b:& e@P)8JH23*dct|LnŹ`bY_}Ë% t6J8+x0늡H6!a} .&}ݫQW߼ߏO>[9!qOXtf8lIOЇ,iORrtr ^ 5rMr,|6l"ʧZQݩaIu<5˼b|MYV")] ײt uJF{&Ud5/|8z|5\v1[7{lD: l! )W ,%g2'}Ittq݇?Rи:2т@oUb"DN$9r1+Ȩ26 յ`Nze첌924P  'CD.RV! 
p#HnPFzU,IZ]XM/iBxmqV?[]YsVY铕6niq!ڔ ;i)9E=t@[*)ewP9rUZP K_T.eFZ|k;|]dA'TH1{e@W)0d*f[n lһJoa/ 5[N.E|^jhxHkSHˁ uX&nn47bgڦ˽Lt䍟=4C:?``d/%OF.O8a è;)EM0IQLEҗ69 ?5;h73pBjXB~d:)M:2&k+t$/囹s֗Ef?7o ۠/K;`gKߙ'=~aд=k J/]{"b@88?9;]ZKۙam1(F]4huhE$K3ZUZ5C_Aߧp 7j;MD81Ŕf<@%]`w,jPk̶{D6;\VwF]ruUPsTWFp+"*Z+ꊨ5\l*T.++㻤 vwԕq+P,T٫gpbA"X 3ꪐخB-mWWJǮu+zgRla^PBWrIn{+պKm!uU^2ۮ N"j-sۮ KWWH]zW/ea~>XJj'Ry?~RR{3Z8g'Eb줺{-#s~[#_k?tS^߀'P }q߀o@cgHK0Bt;]=wt;]=ZRI{HӘJp7UM0I93TC"dsqNF kI^p=bACDAѣ[vDOx@,t,(uR"9I'l楩6\1%ɹeK͓Q9W`&8XsB4<7A<B3ҡ 9@ṕ;[+6Y9u>/Lth)cE62WUk 2?!o2gV);T4)FqqUhfR@"sDGr^P`BzJ HVۙVjN $\2gPk |JF`12ut`Pt*ge/ȭ*]r Y ',㘄EV1-%Nv,gr QOncT:+C"ɛ(+$l2Y -]f <'*e ̩^)?gkm#GE8/6/[yfg,0lf6QISl],jI)K\XMEvUbJXI!\IHhe1rU@ wtP9?.. 8$$BS/)͘&>*,܉h!|yW(:Nnϥ g18 |dEς>۴`-2\=AfZpDes2ǐb0`OtV#ݍ#}6ˋȡg  *< Gm>0+\0蝰6ZL6rcUKʓHG19P:)+3']7Vo I45 !׳Jص[4@"G`䂾KH0ד {QO_TJ]/ 3عzQV/"QD+ʼ@sZ E1Ec"s\:>W(vrحm6V8u298siG9[D60i'ȲgI*Fi&Fh> )K$)P5"2r‹ obgkΧ?o/nsئZm{ײ7yd~Nĥʼn8ٍlu!ͭF|>]>h|Lg{DGY$)+[N'xgv+v,۝d)@-] j+OiО+'\pcT'n.?Xf1q]6 &D#@'DLa۽9{cg}u;}?'np羼?ψxj:.^a㦹;-Y2pz#.44FC?w^}`bW oNd߀ǮRr{5L.P̹WTJUGrDŽa}DuzןHEABHH+VLZ2ih.]E"ЙVIz-pP eYF-&HLpiRm ps99\8scc I L*$ <#:j>1:ǔT&m1rv G h>rPWWy}$RⴤS1,YĆj!\Ʋ<"[Ike.H@i]9bg&ْzxPLsP@]04EWCG}tylcd1NYehP\&E.)Q+gs9 TLpu"B31RHp;m<8v\ pkwNQr1r̕ ̀LIW6`nۢy-u]/\;|]srT6QL|ZhY"j90\| D%`6C CH/:~P]k ,,o]F#a h,$7ZQP"EAIrz=UBƽ8unO8q*aº{&64r?@|oZ8@kŸ$_}/5S"Vc(+QVNjf&c:&e-?5,xMd$S'HekʵO&Ğ?aA6㒿w7О\v'fUF_И8&9|`OQ7S~_].')DMqzkd.Fc ;r.nŕm a:޸E=A%xs=%i68qzBDB.~oAn3$OXG6гax.?ޭmlpe|͛펊I# sB W+r I ӋwW\T9>V4kݛ6nFqsۻwl1 ~Ӡu=][nvQ6.+ջk;\, B6>#}Fz#~I/ho4=xǰS[:vN|Oӏ_)?]Oqch#gg< J ͇Z뒳-XD""FLQS Rpc4m^ je^GAh(:{l7/'x_.zo"7h{w˰ŝi49~ڦL-S{=#!u"'9h!3?VN׾`%ؔUvٲ-'cމWr ʆ)Rהf1I"^ J+ b@et܅3*V 56x";n k[e^|kr:b1J&ijb (w[ %Q"R69k5I,Qmp_~{(S2I =(E@MQưxC4O,EHD-sDCDHG4ǿ#-9 5PDp9/]9E4^3nqU;ΙJo%Del¢׷yrҤsak;7=59kd}ʻ3JHu p1eCLEM)՜:s='esKR9Rg2T=k 8l :%*I% TiXݚq;J9.,B^XNU.(seouKƗ;Hş6`02ד?Ȩ1NCF@Zi!dTSޮvm!vP{ )Vym͚" Vyhw3.ݴf{=hN#sjESNC@64 4J@B!VFϴsͳ8Fr] k6<|->?K1ūH.Ͳ8?囲1/߿k0Vn]Wx?O-WljлLL!㼮,N(3xF 儝\Zq|=O/NTk"7ZOHt6R{!6; 
_q@Ne3x(%AJՇ2!y!ޥd6J8ګ*"N]U&1MѮq8ntSzݽ9! Z/6S}|7l0~Ӡ|:܌BI]͹.wD{EOէ'!Wt kn=bQQ8yL>z|g?Yp6^kӳj*lz1Ź%?FW1WAFӄM6*_ӨBcX8sTǿwo.?w(7s8 G`}.@${^t_ZR]c9-(Գ 5677:\q"s?9y.:_UNu3{w'0#(6t˴?K54J0/p"!v#Mpcn+ze:UR n4x5Fͼl|x]݋+W*W ܝkڳzρ>dڛ mcC!{U7*.H JP1@\`V BF(54rgm:%*E!TIn } .wԴskrje~d<5ʻ,z7D՗^qk]hzwpyذe_ԨqXod׺O8ix5)jn}a8]ނb__O1*86U n6l5~ށfT^]vCDbm>@eCg( uvl'z<19C1L 8&.x뜡BD DS5vL(HyFQTmkmεAHs#{0(; QmX "e?S8Q8ι*'Ӛ/PL&?^ի8ƫ5QLOv ŸMXו%i|s$xN@4 cL[T%(sñBȨ@s$/+Ȯe=уC2$AQ-R؄ U׳`a1 ya,>(.$6ΫҧE^,7d0n  F#6 3G43V'r@򆓠,XTzp-/%G0dFs \i) :GDQ|Lk(%SJ͈Fð\iǮ- Pc4mx_}RJASiǴ4>Yj$F )>x;T̐M~% x"8|X ģS8v9̫~ۂcWDQv!ZmE%D!*ʜ&)M g\pDTQQ#x4[Mx8~a66uӒ]qQEb㨷.(V5:ċuʀ]]E4M7(-C)APJeq%Kɡ,4XKe)V.+9(%hTз8#t^B?I%V whh&s*a bgB."I6HOu A:k9 4qğ3&(/|\Q"\w̱{gʺU΀-MVLcxǐz5KG) /  ]&h"'2Z0+cz@v|hճU|$5ȑ(.@& 0k?9F*X݁sj-i[r?sfq˳ivCӳYx˥[rhk\]L7[|.]绫74g"Ѭ;+skf 7-ܼYZ-Z^s>\Fkڛ_]<6:U[9k|C8=]4#[z5NB5?u[vuxOüt%g{8W!;R?_CɞMrAB?%)R!)aU)IM$q9]SUOSjh]r1Ֆ[k?m.mnO?_>pGV)eF(n+S yk?vGa'cA FKpG8S X!g:9/{ 9kLHڅhp`I4Y`jDzM݋>tW7ccǕf0%VR.Z>Tv|5i?%?KAI+$& ɕWR $*0kR];䴳FFC4R9XV[\.EJ1&*yRzS9%x%q6SfWk9WSٮh-g,PC[~sXKdnFv$V"褕tWtĉьPoQ֎JH~tv>%8FA* WY!.Z]_xE3_N3ye |ɹfn_اu #4nƳvM2mOԊ1 Պ9FyZi(*H\@D 9>hf Jq&P8QJU s,PY |+ 9o!oq"epihá<8ik0jVte[\!`|dY W,XP1G&l2=sǞ_;e҃:QkW5mY[ʉQp8-8) R&iG=8J8):K\T B0)g&gК?xcky\ϺߙuQ/Y 9quj\H6:p.d^L:sbÐ@/=EՋIՋ;z{p*xx=E04@A{3*|v?rxoaGC墦L^::^șBxut򋗓F2:/d(&t֚ﭒiŨ涓c2i,H% Smw?&oDg;?n/Y5۹owƷh|_b?}ߙCGI:v9}vrcnp;z3|@^bF '7.1LSrjawW"vQVHGqf+I .@E#B NѧGBiM O!Yp"AQDQ %TH_:q Z/:WK`Wnn7~jL3b?7ŒIQ1Nz+0ҩ 6I8-"!1}dZ-"SG]4hMnV `}< EXj@?:_HI|}o(T.TrhD%֊i+"NsoQJ(2Jc 11!*K)1srň\8scc In& O5ZkCZcJ*faX5tu-fO>~N, %E@ࣅ15cN( 8xLԧpPJm/gN>H1NYehP\&E.)Q+gs9 BLpuO"B31RHp;m<8v(Cd(ʤ Hv (,FΖ|~4ȹ~3Y8V1i$4st.rj]ΞjhBW0~k.!qxqj0iΤyŏɝ7onW pc䜘kOdn"dY/__|r{ Y 4buOMݰnh!EA ?u`żD^w7^+#{ צ* dzCELFrϏJopNhg0%yI4W11 Zc7!$ uOiª!9$"k*C4Y+%2FF&QV) DcHm7lBUbgf=F% z}qU H4(aY14+U:V?āi{D*YԓeM@B*)Oh4\dAMPXTTTx&o59Լ)jځO }d{,R{KMQr0,ig#0TMbH.{yY A KH!UpJL09uACw84,})FH1PpwHiٌJ@1%(at\`R3BDN H<ñT nF`yLZY~^Tj-b7ȫ@|# !Ng8ɵ.;e^?o>gӜ1=VuӰ^3pj΢޼&-|pקϳi 3߯5g lui{pz8PšmoO~U6Le',0^D+⥭tHbqn,ĆAW? 
v]:buۡi龧 8?9y>b1J&ijb (w[ %Q""69k5I,Q{tNOCp5ԦFa`*D!<#z yC4O,EHD-DOPB"i#W\|J 5PDp&/],9E4^k2.qX949ݱDSS`dTKo ޯwˏ𤺆h u?ZysxZIC S@54A1,|)Stl U !9G{^lQ'zP!ǂAV R6*Y3G)Ϣ qƮօՅՅsƷ+PifrYn>;3i!Փ?iP^o3G3VhN֔˨U7iƢKD%ڢ)XQ2}!60ceɥޝLdP =Mc"^Q2٬ݠǢqǮZ kmhv`q<4L&8ڤA 6@S)a|DgIpj%(ՇIHeӂZȈ )dkG+YKc̩'u!@<QY[X#g>YE#]5(E[rFIftxK Āu"{VsPOhJDAe9_U"6Aqy QFI{(n kblQ@tq0 1:qɮzQ֋Ջ^ #lī0JnhF6<IR#pL.s&|Ƞ 8ßl/XF!O%˻.6[&iEQY2I/Vj)AL T&F)Pt\9eɆ v Y2RBRNezh:ɪ٣mh6T:R/WeTR kΆrC Ru"1g(ۢTIZ#d(UPĄ6@AV6{ Odà4cT3aA.!@Զ6."ِRgds򑬖P;umr%S, 9Kk'T Ρ*kyk)8{YX˘plT~/9! +aVgɉZ; ju}Ax3CtZ6_$1j[DVh"]3y*r^U>زZ*H$}pp ٞx#d!:ޫ><$Q1`U _50*KM>kcrBmNDXJ#_;{C9Tyz(]HQ˘0'42%-G2!)ԣzXM6!jǂ㻙v8!Yu16n>[V9s˄!74q^ȼ AIF1I}=](Jt hEŔ+v,j(e,Q "*HM2gI(^Ф}J (OƉѶ==yAYA;nבbfwLOݮʝ[EM$ߺ yd#;v>n{̼6r~N&[[={}^]b{<w@[\>r{χjQ!(6 }l%Z_&')v!v;V݋MIdT%E)((cbTKbJ+ >j):RCʲhU*Y$l>"qL=c)]O')~A;{ׅ{݂e7}jvMd{3F A=w,d,ƂZ#(.bW*&-ܖ 4<A :&PT|"9H2B`*QTn&n ~g={tٺzlkEOb,OY}9uz5#U A,a4|JȈ@~#i #_B _թNF#6!jܹ=z~hQgzs>)чYRAm ܋^G Y_,6\D/8̻Y"_܋M;'8~rp4Y?>*̚-fЛ^t)iH2UJL΁2Jj/k IG" QЛy 9W:c/HA|==\xD<ly 3-Kg^G/{'4sPA1O"AC7ʼnkPWVA'iÖ{p;ՕY.5l.maN/=݇⶚`r}rKwX+8GJoYJ^ّeeaf /,}_q(ø?qёtU,Ң.jY̳aڣo5I)bƚ2&GǞZ)wB5g]hNu(Z҇>ʙpp5m7v˻f!l꽭uH?NROjQh|/tƓ81oZ &k]:  Q#&;jȽ Zك"ժ7!_xBeMe:iQY u*2'}*E 8%l@rh'gDG ĖeÜWN-ilԝQ4}\ Jw:hΈ6φŵҝ 5{VSW)- #X!O}Z.F ˱\Y)/h˝E,'_;qfB)Q{:ԾtO0~*J gM&NSbCB^-OYWA;'/7~l>4oJ,)/)`AK.0VfUae>x))ޛQBJӠMwleطxdž!w{(ޡv89q JU&grnx8F*[VTwR鴅ܑvsVYybF@1 'WXm hR^-rQQk0 SvAJlL!'QzIEK[tZD${"H91b3qvcD= x-۸-u)z~}ys>gܖ-| >y"۩3cgsWŕ\nkGOo7sYiճB\=OJ3sa`%j/ksOP7n_O?n'û_Dc: t*FcUK 007IYCVGF.P(2*4`Qx|DT@iSǿO&׫ECU0WkY_K5_/կW7_~3r},0y6չ>%wW\;t}JEqq‚82JsBhRӇoyolSWO"ϭ xC){=;NY=ۿ7vOiǪ_mGК&W8ۮp3p8Қ1v3j^Zar3uEeiCgtWdT:t u8]O/j `NX` 8Q&u6:3Diw 71&i`r 61L#6A"p&/7ӳddB%@Y6#V{6 e1Üd$_dZ@JPH酗`dr*jn-hcc%sL6TQ 'r*KGFց'ٝ]mhoRٚ .ؿ\1C z;)ˮ[ͼњv[Ɔ/Y>~S/[wUhW#|-H@5Bf^(UPĄ6@AV6{l4n!=s B2RD!Ch!2+ +#Cj lN> {G.zD (BFkG:V|vU1^ _|[KD@hgSz ٲZ^AQ+R&2SRp1^A|r9BLsJ*H6AZ׷z;}}#R a# *,9Qr^kQ+)ih\*&jͨO_qi[>HbԶ*hDgH""fǖȔ]Ѕ$ppٞ8}K|/>nBXKBKVūg[QX hlY3㽔j$(mkw"²}zkEU>l|P}B*}P^ƤoQ>-٤%Go9 !L͠cƺ8hC#jˬ:F n>dEf?}oƲ.ݘ25e0$&D'/o-金dZ.~C_O;e[ 
"*HM2gI(^Ф}J Nv)u9){sf$I BTEH(I±pA4ُNuaBo<y_U?V~3 玷寶kpU&7؉sּ兛1 ?f>FLo.Bnm6.wסݟWR1fJͣ_GwO9t5f!՜O#+yحOmy|7x䋶{̼6r~N&[[={}d\ z~t>OSo :W[sˡ9y}h -R֡*65]bAO*.Mӥ*1}oTtzh.R Qb uGr sG) cV`;v tmMNHG05ŠzqXlJJ%kGo0^>fK R&QQ Ĩ|!2?{Ƒ M0R/൳']`1!álzHCdS[ {ˑ=ˆ)*Ljd<)V2"D-a$EV 1̮ljX j>ݞ04.Ǔ]O37ЪL=Kn4ħp-#%B0IcI!8b(jNES$xS.;y-Xhm 4wFۨ KIwPl8+mm!kOV:G6Gf|#aL&[[{Ѵ~GJf#3@'.ծ`֢BXEvqoԎggG#J%qJ7MF!qb8J q bvY>hɴ{Wz7iܝ@,j (?<חCj[hC+g8ۻW`捧1Ƨ7[Os</T^Ck%]M'7g)7XH>`hp%JRŜ(aHI|' ߹lyJoC8]m~Z-ZpvRXl-*?̧sv>vw'$N0O0,NB[;P'xyucja}wNJwE[jn< Od6K2=isٜ&r\;,AlG5~K_V')MT0>wdޝ9I33ְD4K,^ɞ{$/wR5l2KMP:N^Y'"r}f+7CrƳO8k@yb(NN''|߀\wbLbZFAX:XAQŜ+ﳆɟL<29&9/\-;u|rv2¦Ug|f)]# )Z2b #mٛ<{/_~kJx\W/k%jPM'qǶ|70HQ]_Wѧ9S$_U3r7*ý䀭NXMGumqP1a:>Wdt?E+.~P,_u3K3 /MGQO :7ߛ9۟$\c('RY&r4fXoxĝ_ԂX2'Ԯj3dPix`,ڨe +gI12y6d@$WaCcl f#g5tF$K-j>^cjK<}gY6lV%jZdki ?:Zy)?->L>isT揜QЯBQ0B`%As߁'C#`  s1>fy2HZE1n6Bu,rЉt2tNCONЅr^+.4y?\9(u HTVz d ƹQ)idl'Yw޾f6hh9Z¿ieYA+V|7c0$_VqL7E*\- },L"0Dh:+/;b* G2, ",Q3"qQN` hF Y*#R1G EmȰ, Ƹ#aREbVk#f#g7F ]h诡TÇӧ[7P> #AT?9@ɴI2TZ$j%=9JM9`FPO ;W->6VNI%<.@I kb(q$[ rϙ>g@Nh9E }J_w୞`E'̗V o EJeK*DFp{(xmRau)ۘ7 (WKMSyzDJ;f2ʂ/m=Fe6곑gEЯG]&JR^C[w obKti-e6#9lFZ`*1:x:-4$D v[4&Ȉ Z# CKnxS^C39HiHitEfEwZI  AynYX3T eh5%=pBIA?) Z)g)ܟ("g6S" {%cGat?ִW@u`YcgB# q k&)$R7;A+H$g{+I=sa^{qI߻jR;ٛ7"[G\ioU&4I8s[ާ0Tcl͟g].+=i<$DEmo͘L*9V` $_:"a]\IArµ^TSPaٵJzsiC(XK -[X{7Kx(u0bu~,ٶzn{cS+*ʾzQb` (q 9q4QMŸ7Wa wwPɄo{+sknO"kgs˛iepXsm> 8\nmEj/7IW\,0/BGbɥ8GnCy0чrT(Xi'BWwcmcuT:dۨms-t^\FoK@^:7jg1x(@-57~xҿo^Ooa޽wo~| _`pFGϣ` ~hr ͇1R9g=[q7ڰW32o{|zY׃ʥy]8U^MyHȦ=? 
*irY-R[5JPeh%Gs( JK[0BDJSElKc+ [NUgd x*W@P4WL%Xw@68I uM1G*[ۢVct Ne &;*|Aޠ\I1#~h0.`Ɛ)y2j0- <<,ZG UqSU5J$Xq޹TEnQX(褱ta0wT$UMSciT66@% ic5Ne F$= 栈$VJ!(ފ6Ob5c`q LseĸIcj V5Y,K @+dr@OUH=+enT&A4-3l5 %EKc[j,gXwAګQK<#QIbY"j90gʫ @߻ w3j0,[^z mk-x 49ac[-ktZm@-$| x K8,m8@fZ Nz-(] &WJ'gɤ*=h`"&!yA3 vXm!TSc!1 @1Ad"e+f䃕EJP*K":t`2#&Pʨ%ॻM,TG~G_żӬ(ap5SRe%)8 dxƒݻ89c'!N‚.&jH4} ^{@#@ xvm^O_΁6Db6y->0B(5t1 lcBQBGmP D;#:P^ ϠB.ds۝2Vis^7: hܱXK4iQihn E \"NSAq-F|e)ev1PDhJczF]_19EPC'kLiFO 54VӨV gFGyAע,%@&}*P WMI~\,Y#9|5(Ws Z_z}S,QqpJe 7`93#0`Y i/jVAυ4S#Oj|d8MgZ6M}6VŪRDL0K1!:FflR" K f LB9 5HxX B T Gqm.r#o0ri<3\!b[9>+=Vo|Ḧ́G,Xpu"4o$ao6'X>@t/<HC+2`ad?L VǞ DV*3LOn7/xQ9]r6(#R0D>3S3#%9V-sWZG>8|ч団wgst5|cr4lU!\, agQzQRy\.ExN){(3T-DV-d=?Nqxx~>?oC)PrЇپ}B/sʆqfUE, MQ 3BFd2&F?-|u5D:* IZzXI/bieQrR!mk1l g1ǎy>]ݚn%|zGqk: ؁#v*x_pEt_3O2-!JI3C :TQx;:ɣ3%MJCNOY\\z&#Gy %PD7>'&Mj11g-]F݋Vvu׫lėZ8MOO7Vl_8Y)6sț1,qo+J u^9a&Ӎ&|&X'}}|!զ i [Xrq~ml#BjCzu|;\jxwxm gwʳ<`.wmw YvFLzϳ7Fesv4 KRG?\,iؗҰai<ϿڔegU],!/~;ʋa O Q[|1Dr}o{E8^zVgݔu4;^"sW?WcώhWJ0SYPB|I'u7ŭ\/4c #^\y}5-EfqY[W}}m\|8:P[WIB\^s./^~4߿>ǿ!?sHδ5lw'ey|Y%smغ~/ᜎq`eϯ\,ޝo6`t^ \5mxvvIm6VY V#>89ȹbHogZwˣh.:zS]|5?"]Ta,GW3G<\/UFՎ:Ϩm˿c` o؄QnnaэJ?Ύkdx4^ichd8&7y3Ydl'N$|ܳ 9K dD6t8r O6+fv\8fX!=у=s@:A'^C*&;GWd_P/s]Oo>){U{/,\r8tʃHblU3J%?MLx\UwX`Ga_eCedP7xwطtLHh+\ř_:Gn` L' UxH>l;m$L6 jG1&6ߑOTpT\ u5Kݯ9g"KiU:ZVI)Y1 v(LLǀ x*u/kV,֝/AC74b=8 zi[rͪ5~1r]Le1U2uL鲩iL6|g ++ Wn'[u }Vo>)c'RoLKܾL ziA/%C0/L vT+Y -OЍRZ/k0QMVZO%Vݸs]@QyVNhJ܋uL2%Keo('1p3S)UW2w("ӫQ@ii$`[zUXbBd"g ^Rᰇ` cfY1JR݅|0r y20#9%ot9wJ)[ -\,CRLV^_I}l2khߖced$_\ƄNӬjqSN 4Iu M]tBEL?G\WPYx{q"6+MgrŽ'Ptx:%x~_[k6,OIޥ)u ڤ5݅-`B]3:}W(fibkFCX ШV]^}nN*Hn Lt2;?j{H/DB4W!(Rm""zmY,̃Å>֐]𪷘*]UC/~{)ZZJwh` ,M(b53"RNZc\k=4eFr}Wes~rks،>5hgiྲQ~=!.r47 >(ǺW߿7њpN]*k%a)|70 \XܹSi== >.kY7W"-Ez%8֋=v|zwx4FU;޴p6s3` mIP,zm-vIf9;lYnd!ݩ/cSYh*^pJXX=  J&)`E+29H` 61j> ncWuͰCd0V۬ _,v䌈yX4䂃M,sa%gT]5w+.jݹӋ.` fih=xW``O%p*?afW]9P^CYGSWۏ_IOiFU|^cvF+K=PјH(i"d:J{S2=]j, 5cܯ*ta5Uʺ.|T]QXk]4̆:to9φykyQ3$CJZ Fp Am\ [}Ms=㬨Mt(t2 1'aU.Aډ'sQ\q1EkW]jm`7IRiT !ś J %f1c|̲saܯ; ^ 
b*P4b5U#QFDsK--`qr먷X+вT)bg@5ucEX;kg ^8mbդdW*Ezi&R{U2`l!&oBΉ&3cL4FIԋEV]nTXy(xzn Vn.($-0Т>OA MBJA9R nmVC$h1wt\xΕU)I : p#= ) 1& zJ*2#$]׀WӀ՝ ^>!!%(qާHf\T@?Gs$Zj6[7 gՈA{j_Z{U3~]}9 { e0R7S7~qJj9wb~/ǹ(n楪MˋѴ_/?eϸdXO {3r89n~EtjGA\P$x;o~\mK$-B|<٤R- ]Li~dSHlRk4xfX+8Rd*)r?JJbw@GzF{C,E{%ͤV8Zbnz7k7[x@졘 [}Ե=+o5۟Iw5}y~_˿F̼fhړ5woounuzLZo91vD`tЃY{#8qw/^|{byӣk5_ ~L?Uw8'=36U]?~WݦnRUto°Dn:zdzWWeլ#5bPۋf¬d G'Gw~h? J("t"]S 1yQ'EGQ'-Dy| fĸ$xV95/w%gn6#ӼUswTGoKT4_ǹCvTM9nJ;~fo& e\m[6_jG~m|<~Z*ԝ[ 7GM.4[&]@s9blD r ӯ"rc_zzzzzzRo#6Ro#6Ro#6Ro#6Ro#6$dFm 8 62o#62o#62onSL*ɕDmDmdFmdFmd2L%$ȼ2o#62o#62o#62o;zkۈDۈDȼȼȼȼl4ȼȼȼȼȼZyy5"2o#62o#6EdFmdFmdFmd|F!6;"6;"62o#62o#62o*(ל= 7cuހ*lu.Wp/L =w3;۱/_/rVG)Jz=T3ւ0ΌkXZ2u.a ٖlRYЖ)1%c@NOPE SdY;rri(Ql_{&t$.JQM44(#LVX?# ~7ن7%yiKT6 W ߛv ӦaF|L'gښ۸_7 Nas6gkMdPTi %--JTl t7|fv7߬r N(逗U^n꛶տ_z5xvY`u: ڂV"E7=Y=)ö+8 k~E_[V q+evY_;ZQ,;H96 k3@*؍HVqDdPH_7̊Ho㢘|BRuNfǓqu6gA. ui,Ε\57a_o!X,͵;hAMU1@ . I$mױ#ұYZTw::\^1ޞ%6Ug;#Iݟ1dۅNd٭*4T h Пu7Tşy] ,ETJYk-]ș&)VKԺvX&bnmim1 S*T Ià$%J`322I*fu sWTgNIRkTԕc)T ٥ROIupHG}t6tuX(vkO~^ %DctQ tV)K&J̅7Hdd?nsFCiPZq hei3[ M |iu2. A* ^#1{\0/eJB `v!Ǜ-#;N齳5=y `w`Y )ʂ\J >e!?f'Acy/< G%Gy-UƵIDGɹcKъrJ7Nv&L~2!SEVAYI\X{*R^tI#YJ2k[w;"4Yh+yoh*Y(cr (,7+R '| *N-Še̢ҽzt+0VX3 yvh֭v<*[af:5&ղtۧ e;<"cq0|k?vru>iWq]WWBJ#/Ώ8R(J'YeCYR %G(^FCKႶ 7nڔ %(q5>r|!x\f䐉kHSamgý=rݠgWMD$6M>n@V7oms7! ďm+1^_t^ދ>tu d:o܊I;ȝCK .WS(@hwץR덆Ab.okZWԺyҺZ=M績U7wf!eJ;=_7b%Z0Lv]?>2?Ju_=p̭A:W|e}nMu 9ͧq|m?0dDq>omתɭϗ@3vK1i*卩PeA˞˞;wGݳAV;2-@b2Ah#袴YdRKZ(\S9ަ'":}2 /A}a+j4C7a| .mpD]l'tYF& ehLS`DB˩"nBGu.7snI*p|. ٳ+V$VRYׇʝ ~ ވ=Y.渙S,v^3Gb^a:g> M_j N^cB`K41N0}29#Hs]IC{Rspެl0Q+h:asYaqI pS=o>onca{arN6Ie za0p[{_,>$r~}=͙ h(H@ g,,pFȳ '-A,BcKd0kt!pōYA١KMҐƲ#FE;Al 9i*hd%Ț.pV 0ѱal8{$ B`e螓N"^ք=|&gͽ5Z冀}D5<˪T! |e3tY,eJ:@ ;0.'yp&DA9Zc O7g2l5=yvǔ5Id2'kU6k JF qDFH%COqTV1'l9ֳ '0/T &i!z2}Cɋ ! 
k}gZ+e - P},P5 .xQ\8gnc,% b{5#C2X%fCrtn3W;v M 23<!E`8o\jd:0jorePTx: ׷; 1ewCjxqx6ϙmťçiljagS9RFir}]$y;lRajaoi_ٛVxFJM~j py{ ֥n`geN+~$-u$yIXۼ..y>_+lezd,RDJC:\wp{+$NFqz}O esA.~lWbi=|q%ͷMwN>iuC'jy~ӵ7R?/)Oϧ^~\x1'=p|:*Jy?p>]am,􆰔 ZAeoH&&gM.k- 5 Vjq&'?oY NgC5ə]=hrXoKnqf ?L&h?|D/{ŤnqEl+115BKnם Tٺ ~ҥ \*jxHme-W~=:6=TS/ĤQaeۍ2.N 3\9 ^x 芳"1Y1!GJkBHZorQ-AF$uU_Oy~>Iu\ qׂBc \,`RиmI<$ef%v :ƃoL剷,mCmξEJP>qb u|)}Tm>VZs҂NJʃ+BrPr:YE!8{XZVAQ~*yCVHUǤPR&+QaŷaBsu^X`Hk:EwH8 y *A@0ٻ6,UIr. $`1I0F$)J!e`jdDQ%;l{ߑ"YV'WK%>PUJ[OA-y{t;+mlt~a^0 8trgBGIxʡӏKIL8B/ o5g@8O #<3 tM!҉#|"ř E 2I eWi(ڵC[Eom .43zu"FlN>zm$ɔĶH4'[ά֪y( [yd,W =[- 6p+V vo>}4c/ɬi89z_ŋڒ6[B;3`z0iFu '5jBB)`EJsN8BbFr](BJtJusΐ5E,5aadnVƈզs7F$ ^N{ͤ{< ?Хgj@db~&ib˼t' Z،px*PCzYZ2`n}vN| 9~P};#НR*b\e6J'8LbF+F1sY)D 047[rN~*]wE~43?I4$4f'L9 ^ X5cxND<;j)]+Ѥ*ʵ>~0+]/fIBY r.] t5Stn-SX.4$j;[ݷq/:.Y$DZ^nͫnPͲ&jl:- мfNΦ]8-Ze⢋u妦DONW~O‚ k(֒hReZ4&ElxZDtFpB^|<Fl"ChΧuE8y!9sҩskVT*[[̵AfD =:e0|8%GOV eλnO-&) ''ۧCևL^+$K3 h v?dc-㗈oYpHzjqP/߱]>Rav fL>Bؒp>(MU o]<3^' J>C0xC pb[j!>PYkk1rwHX`ARJv vk٤䧧 V צVLl,{,?3L̖`h'⫕ n*P8}ycK`5V'2R 6mQ'&Oxy >8G $@~ǹgk:3M$:!BDq<a^s$R&J.HΗFCBIAQ*+1#r!X`h@"o NP'.tH4KP9ܷt_Kӽ]˕L|Yf2-.wǞ\U;sfĈ{Q nj-B'/5M9!9R48kj{UqE.KM h(It-:PYù!@OZ{1cGPD8bYj.,KQR*˥RNZX`Yro ȢQj3B? pRM[up" hon:"ڲy\16'a ~5P~w t gqZoI_*C7]WfH:/jJrSa٩ͫ{]հy ׺5/̨'mӓ f n(.Lp%hI6P<lNBǮ]^[m5\̑G0xdNe"3{K[O7]Іei4NpC^ W{_ګ/tEsG]viTW/ǧWB;̀[mVwѐ=/Mi]/m8"kͳ6WִY^2cʛwg7 nzMMxtx[.vAB6#?Yo'c7D>#yaD0RI,#&t2dhbѳ1Gǣm;`Zdߨ}k56xzV'q#e`Eɴ]H~$ٽ9YW~"Ioy&}dq#"ǯ~ۯ~xʟ~?w? __wuc$p7`]bg'^d )bs MxJd4k}Lٍ̓+^@dGҫTJ. 
Z.օ3;ƣ3gWr?n4x֠7+|&#PѠ.3FCd h!hռ;T&| 'fh~ԜM:%$W?{ mͻz*ۡ>ܠϻg7Wy`+e!wep Od,8ƵY7b䊃Kser~\;\RIhԜᤙ}EQNHy_i;G;>y?99ssw4t} >s?_}zMVwq/Xޯ`yy3>,<(9T=,+׫4tIz9 ;pr6{!n|#۽[f.j}Z@ ?nd$Y{<7 ЎI2oYԾ)V J zuKt{-E$9)dV#9-m๸s&Ht6j DVh=VY}b eHч8r:%g2'}IT&]Xm:[UdSp7,HFq`VxҾ-$&.YB\t,2Bf*"0D 8 *[I ҿzAAGK%JfceF6Ö`)N@e?ؗjtStUT7O/ Ğ9׎DhFذï?ry'cd"ힶ?^.o3JF;XfH $kϐTBO$P sn%'INGBmMN)[b qAV]c97J0ϹIHՖզsdUj,edgte[\܀ww.fq>o֏'l'Rf+se2ye%gQYI' 0Ib+!`2!ER=h(L&mǬ+N爙EckԚ;L҆/Ru*ԆAj6'fQhה$sxRcN\J٘=wYVҐ`ጸ>Ū0+m| Vq !bMJ$R$dQ 6jX2)"h,Mny.E0DFJDUY"A"V ҝѥBK.l=:<.1D>T^{+H#7+.JDg9cרJ\$Cs(r&-F͍D#YC,n%b-Ϗ:\K'_g5*U.rQrqgL0F̝PIhE ˁ ̃ZG r1jԱ<4M>[WKY;UUս DEEE Iя*iU/IxGQ{ؒYZP)!Hsa!8;>0KRj1[ř%@fddZI ֠t ,24ut%n˫xoGWEñ*yx5!2=x%ikZӹ@/>5X]4zUٸ|{ J:q$3Ú DQ|'rѫ^0TM3'«?\ë1w͸ٮT+e/uU߯}^<.`}0Ӎrr|˞+2EK_eՂ'Pʆ(01 ? Fq._DP9Bf1)$ ݝ1`Bc{(JխYv%؈Ef)Op<&rBA Da{Tу0RsMܻDƓ Z;MEZ *d 6Y`>`YbF712Q3RS,JicQїlɺ|=5훹d:om8ʬuI޵5q#7f,/:ٜn6:{\S$ád+)J")ZZ4ؒ9n|ЗdPRx}^5ٙZV_ضϫ2W.?KR,3N'aZ[*U@ob|h{Su[mmWpN,uo'׿ҷ*c 7V:v۵f)Y,vpm?,. gl*hk6JzawN$K>8"7?^n&q - W_w2#==c @b0ԥ8Wr|w߄qd:;zߣ69J4F=XŻ;$uS^xSoDg2TBo7^[0zL&A O_.Ǭܹ׮7,kHەJS uR $KYɸnqQK╧i[~Œ5!M&4"YF,)̰6E?f⒟Ծ>7z~\1H=&I'Ok_p_nܴ:Kee*^0RL bVrmYL,tK58TАrhCDW<:rI1 !0@~y Xv1 };#!A'ʹLl^$Ku.(WGZlYׁޫa)n.Y  ʚIi2J&){;"Eq,uYcZ|>3\Yӽ=Flѩ,ϬRJ7̕`C(}*!wSVw=:r>S(&G 1$#SKUbed2hCQE"uDbr"vϠ\ǑF춶ЮWw۰H두tG@Z~ֆȏRQUt;]&,dXyUS),{<@B4EYr :;.@SG0xtNah!`G/'S6-ع ;-J}FB~l/O˃"B_$xKH5[k2KsQy<﶐ FUX%)ϘŠD_mŏX]l8bCYtL0ȷ5W/B>"Ge4 RsY?/s}2Yd.@n4DZHQP:7)2 # ۙV^N2g$-C `W>+4 ] FqiuƦnK@ xIwa,Y /\3ibY;\v)YgLHďwg gO=kgfNv *]>6dXm:#('dbRY ]B>3#I2U]sPa;ٔ  eɥ5"sJY`3kZA8'=.~;}T(A`BPVd-DNbdYd99 *<#tc84K0ig2ql_G,12]d!`4{ą%72) A1^ u#Bc%qYыF*XP1&}#"p(@y·R  Y&,*ݫGwcULE;:jfjdzΪ닺,FKSiR-+Ͽ}  QP 8x&__I*$bUI]y%u~tURGQI=̂M.{BȒ' O. 
8B4z"^ evs֦)EqܧDO\dm2#L\Ct k;ΞUSx>j"* @'h NvI|k{y}WUl_t^ދ>tu d:o܊I;ȝCK .WS(@hwץR덆Ab.okZWԺyҺZ=M績U7wf!eJ;=_7\=-dG˫3#qޏ]5_zOǭ聣fn=ҹ/Kn7wKmmfl>=ek+!̭wu!'|ڮU[/Cf퀻#2b5TSy+"XR˂=w?w=w 3v(.g(w&dZddFEiȤQ,sM5NEt/pcd:^Vl?4in\/ڬl糏}42&AiO2貳L>k#mhX)e}t"TpU:97$u>+[+F)UCΆsoY@ }[Wמ~qs)Tr;#1Kw3xmhdϵG@'LR!%'>jy x$T`K$ӡ=98oV6j$ CB݂iQ^8) ωQ7771n0нԏ0l9n$n=0lzj/MJ9?CWqqfUIFv$3X #Fل % K2 N5:Ƈ, ^XХXl iHcȋ" v}yDT6eK4xV ZZdMz \Q L0t6=g[ !{ tv'J/kB>FA->~_nyeUφ}^ЙiRE2% E<8BLhWB13coz<;cʁ$2ÃZC*DTIcV# 8Q"#!O'tby \8*rv RoGO^Y@*Vx4=>GgŌﵾ3X~ބ[tv_p(`A}( \?ȎRF(phs.ED371kc1 =ךx!D``ƍ҂p[ ,璇Bbd{3f!B:{O7 w+ĝj@gqąS`qdvf"\i07.5|2u5 7k~92(*}Z(R?5 b|veK+p+insq|L[:Qft^~Nq|w>_}w$?bLWŝ懋9a矖M~JQQ-j kc(է7|Dl`Xŏ *[ܿxG2!47>lr1]h9VpiV79|WyH:UwR?IdAMEr~swЍ]_w.zQ͠~d2FCGWL ^dl箬X]^byS|!v @;p'] 2၇fX;2A{#_ɪ3j)ocA5BLڬj.>[vYJ(bzP;s˕`g8+ Y;.3ErT1$.d((e,dԽOR7[$\/w-..8E&I F0LRF`VҩhN^Ϫ љl<(T_x6t V=[DeOc)qy,oPiTXi}6gK :)+Tg Ad%}p}ciZMEYEGZ!V!zBJ@Gv9 myaтo y% (T@O2$nJĺwUJD|f~TF+5ԚQ/nnFy%$$jsNrjUֲ]eWb\T#)bGYk3U"nE__S-׻ldt8ђ(ܒgJYcG&+_yiKNA9*vJBF.}H?49F@y\Lec>]P~Gq)J'*`Q5Y&DzCۂe8Ԁʽ}hNsb|۬3-*"Y\/V3ۊXUNCK*p E%PHn4Yދy n8X%s:}1J/o Ҕxou(JS*Mly=UU3.:T̸qH` e!\|C,@dbRLdRhDލ.e9eZlԢsnv#lTIBn&֔w^An.3f.!p#b_޾fdj=~ p]TS1(WYnizw uqyz^GZ& MOGTB<%~;"weyDI+mkzyck_nXn1uGS}&|^jӽD]`cv{[@&:[/#~QӾvoȻu,:o Y;mK mgLdG;?tiHjlgy'-9Xmak~s̗Bzn.^*Ӻkw~='\q;SW5NYWHϺ##@QA\Hέ~`.g; L>N k~]}\?_yhLyd3j3"FMͿᄍbC,jl$R$gU:YhX[!6)S=xG'HώkCOm8/?,ϭ$4Y+jhn*EakkQl_wH0]wP&I"U%&1ZJNAγŚ~ڢ(~|an5d쯷|V&[Z_ 5}/Xk,T2&#j DߴmEKKDN'fu|moל/ƜK2KS1=̚m)r$U#beR-|&c34qvccS6FG2eUCl9(w{pGk83E2?ZfY1t*`oo(K›Lw(#F0U@ȿ#`w!\1ш*RZ ؼwxfYa uA裃=La`tG~+o#dm2[\=nΚ{TeUMwQ,9UJVƠsu~A|nu>Òx~vYQ~˻jIB%ڛJU0 `BZV2gmd#/ir)ՐR<PU,yKL d-WWqxUYcOih^6gH/Ag!ՔK468ݤY_b +b'EI5V~M/:ZfB wV |V?醼0Q/*9c\X E)֘V XT`t[w \ q^|7g<l(*u&JB~9|]r%,d,C΍@[Mr3Ag\ZB4T&csɴL%@Uಭ!`oԬ=Ҳ> Ki)_*`&rȁ}iEj"s#H(4a[5pn\jਗ਼XRY d P{Y4p..юJc .mL+Zֈt!fP57Ps*L f AXf-@ ^%J=41 q B +T{c1* T7!3شήZ@*7!S1D\ ̑ YRHl N }B7W%egZx5{QR}]sqҏ fyUVo*Bb|)1H :RD@BYZ,kvM4z2T+ -s N#D:387}re(&**Fɂu(Lz;C8i !L06s;vvG1z~4JF#=ۖWkը2hp"P ǀWP\#XY)yP, KW40w<@d!hQc!9<% 4I"eLBȼ&#  .Ks, Hx 
_2XnměAq2q/]g,T_ɂNU~d}B᫈;Vm+dəjo|5&}sߴ-$dJ؇uh{%}}V{@#AxA^RgʩE B$*st]/oi#LKqES v1a^4Kpы@vYܱ-tT(dξ$9cyQȂjrio/5a.@/ʀG⪯[?o,,  3EHBir9 (=D;䍃""Ǧ@b5By( !DA>&~H519K|j%s?z.:9(Hh@YNucH@)BMsz+R j oQ[A QW ~ {TR Y\7cs^yM?|?uj_\w2g57P7nl8 4lmtFî[LSEHu<5W dZ%DmZsHYM<8FȘLyt~684#61pȀ5aHLz( +苻"EY.KeT`=@C:޺Je29&di|opꡮ1{3E^e*O[W TK\ (\uB߁] > U C w((CcDi)Ub.Fb糞w:P%Z`pOftjɅ0Q`48\ ڌMZ߁dhEXAZsнCk!2s\5gBr pmx=?jeנB+Pr7޸0 ڊbT98ycX)ʙ9 X&[ -XV RLRN F:L@=8Q/=hqC`J-*5Z-MDW-\,'lWd|$CC$]:)`JD LB 9ϵ|:+yG%8DH&@_B# dDQrh2Lb|~󝁤d"~@}'JZ DݜEZrzv\/ g'Uw7?:+I0.+/ 6mVGn-w>&3|F3oʙ Zf `-G_p+/W?NU_'Qp5 F(Wj\Qp5 F(Wj\Qp5 F(Wj\Qp5 F(Wj\Qp5 F(Wj\Qp5 F(Wj\/ZpTpZv }3W4}U֚Qp\ꍂQp5 F(Wj\Qp5 F(Wj\Qp5 F(Wj\Qp5 F(c1@bT"ٙH&Al޳G%ɐ<ϭ&٢HId.!P,v=Uun%U"\%U"\%U"\%U"\%U"\%U"\%U"\%U"\%U"\%}p hW:nΐBҘ.?t W_#("E"\%U"\%U"\%U"\%U"\%U"\%U"\%U"\%U"\%U"\%U"\%U"\%U"\%U"\%U"\%U"\%U"\%U"\%U"\%U"\%U"\%U"\%U"\%U"\%ս(1(?4GC2txvW$HDWpWpWpWpWpWpWpWpWpWpWpWpWpWpWpWpWpWpWpWpWpWpWpWp)&V$oa<{]BPn>~7d.*Js1jǵ D^>JM6vGQg?'T2+f&^|io*FhA[C̖wNczwim.hx<G}4 @׷?Bk'Y߫dPnNT'5]X좹+{RΓoz%&0 +ylOOZ$A/Vc) )hS@iٯQrqunoݫ)$q4\gQq4e 8*)i?W;ٷ FDtMe/*˿G<}sCq[ާJJ *'Iv H=5`>N_DzI' Jӥ/sg!lTdFZk-G&vFZ~܆0|(ʹҌ@YUU զa*NT%j~.7A?6u&7Ա"5]Cbm*Ub3HeVfozg{I6q͊;l:;tY2\+JTj.7Sjd46=^f;|!ݵÛJ㔬8m@Ĥc3@Ǜ[]Z'X^Owu>o&wmpW.kx6u[.d3<T7VZ6D@&,4Im`IC^:Ḏ. 
5h~Qơ7NgEP6EtX EZ%fΔxmo.uE׮֪jI5:Kk>Xim48PNmKM_0`": U4jKel@Zg.~*>GTQY7IxDTwQ16u7u|{Y(^!L4_wX6Etj%c-O+ ;]na}V[m0BH*ȣ9[$z,aw}p0BPHsɏLn:I>SB&lO/]0t.Rle(iUZc7N[ 2i`8cEEa0 pͤp'$"[>ree/;h;l\Sgf'K:o'qB=NJ%>Ɛ~') AbbCHKmTٳ{Yڧ=3->ws1dWCq[(W?n,՝Ѳ|8NSxf]T$I7I蘿JǂwRS-XRG좼Ihv^-QTPɖ+H I%p0jW&fU%)ȖVJB= [iPں#ol9KT2UeWvF(T͉!PGd!PJv=zN>>ׇ*mKAWPX65TyOp^* RIuauylfb^˛^F#W}H {]0R)ϑ4VFC2UrU$hȚ<0o u&`ѫN/-_+)pX9_&WgR~rN5>m$=i%AoG{iR\*&өՠZ|, Mh>k磓 AӞ!#tJ0es%BSxjD;_ %xE/1~Dr?k2󩍛gj[<ۜ|ZN5Iއ,?lջ]+5+SffrVpȢC׺gtfrFuzտC1 Lwm݇}=n Pe@vӑʗvu8PTxk #~jכ+C.tIe#XWk.Z ʖks~>+:]TkuXk@\sɭ`)$d3//ćZ30jTn-cz`.W!Ķڧ­#~7" Yi13:8N VW*]g^a0{ 4R C!fHa66HF_jRz\Kv)u֜ҪC:ʈDd7ʢ^n\[(y0^$Sm|&݊^]Q-ƹ6@~?\ \ A1EG):"b{ x_Pp #( =?tp,8Ghk b@}aDYCCŸRQ]Yp%+p3 䱏G"k/5qh#%;!s#G5g{p>q2}HA3(߰`;|>Nն6˅sV8֗ˇi_滭K ]IDVz `l (/id do} M ׫}M,98עk(gM|g7 ?>g0H^d7^"g2t0JqH(>J١=G^Nٖ*΀ {Y- D$Yg(1F9>H1%>gFƈsL(QmG6aQB0A[R j-B"Lj5g{:(ZwX۷|O-xK:nҴ6.LVk03:q:ȇi@6XlQV)L&yORs-є`2NvQ*r:;$cssZ?}a0j_5:]uRv%aτ]&Du]5DşQS`*a@|ΤR5(q*P1P ‡lWAAf[BRAY#5B"#$(xmb>v): iLu+$6!`BQ zZ„gkY1igi ,#ǖ5|mm?}%0XEgmKxX:IgQd荴T"=k ,1Ak LPyCYg*yv`"S 3 wZI%XD0`AP-@2DvR@qðN/ԼpmmNh0s" {%&gQ/iCMMm @pX0`S `9F6^S% J`E K_Bb<RO a$:0 6'-* #&q9\! 2>;D%&-M@u˃"y3b!~JIIÇ٫WK D0.ŕqp0 Ioz'e_09س7ORIR&B-j6"6r44fdGX2`)"Gctt. t.tN( t^?_ TNpp`! H ˥CR ͅ-?Y?=ZU:xk| ~|䳞.(9Z 0GMoAu %k0YBT_ͮ~;^kgL{Ei38[X4|A[cX Ur ;ç 51P51MՐPCa8*}`8O._gu፵t{NnuS_%f %0X1a4) _MY\ >G8U{7 P:*?g?u/^>E?zٳ/0Qgߟ:ҏ65;7;5;4F}K+rUC}b-V^Z5K+e-R)}r?O]i4OU>MLe+l0*)nMQ]ѕ 6R .}bM \gA%Ϻ#]&vI~wW9lЎj0RBk. +εT ٻ֦8W#-#r'G GfU],wg b@`D7=5ݙY'A=R$(IrζI6j~i/:CuvrM<}f˭Ww<}nyVͭ;?uEBʷ|R!>?<-Il.'WRn@SM`L@'&vic1 8ytWG]Fwtn H&!)|k&+ l4q %I64;wC }= uNmH~iytNOn!]~6. 
Xϛ'9>Nwv>`glጆ {=THp'O{'VԹ9^}a\2X_rUo~?(F!R|(B1ЊgPE}5#fu#ֆ+)Mo䒶l颙(ϲ?@]|z'=OtN*^\,U߽^$Ϟqp`C5ῶᗪW߿NM̭ v7%o~^8|e/lf?>]9>djI_-]; ?CGz48z.6Q=?sXNO9/ ɔH)'W{,!:yVFļsC4[[*bmeCsK% k(IDbhAg"zշS 4-5&cb }]Hڱuc= H2Gt@: eƱ-l5Xd\h&>x -{}WUȕ0ƶs9TepרxUUX*kzܹ׏Nw9dlr|sP89>]pn>zݾ6yZKyKJ=<\M{ǟjШ|w[h.kԢ`F@͹5s.|dkm917c!L̩ژMxܳ!4H%T$5vZ065 R#aapel XXbD|xU鲜J|NhGl[vBF:b zS#iBeCJװe[%>JO:㙝w=l9BdMMs;`lE))n<5#6K)DaֱmԆQ=#,^}5_ ^sRkM5VĝK4X$'kk[ `A3dEz6EJWŔ8J߰mk M% Æsw<\%k&w%"m#bxk&jb'a[&AdA(Z#}W_.LV a7>T#sw;>b5.SŃZffm1Pj."Ict#OX6,6 fP ibr$ARo .55n)pZFt[PaC6,7a02]#k^0_ fƫWS &=䵈 ?ϕz欸50c^r"Yk|X~ }.|*7Vj@1 Y'X )q.ԜOp }j!q,a1$:!Rmִ*FM\U\(sp84!pGkbb 1"ѱ!pf53:l*mq>.ElbZ[LN{G'T/>n_Lr?_ˣ k|H? J7boKW|ˋkzyGś䳔Śp^Xӹa1~uv{:җ.= [M۾דRB;b8gǗbDLP'+=17+C[t=}Wgěe/,ȧ\%ԧ錯PGK]DW˗>/V燽@Ov_~/o }nIm7 XxixE oAKIo M 7ꍜ[w B_ r+-hx1GK_$F]Tri^A>IN-q:Q׋Ьc봧{z31wzA~ ǵ6ȫl}uWwYVzmjGLxkڤ0t%Mw K KK(;2]6[HTvA-E1\nɥi8Q \jrd Mh0&GޚjHߢ \܀eI`bCFF"GsjD TjS2J?r[Վm>k{D DeI`])4MblJ!HI X'~c9Mgk?2=9n@[:7̫{hY[/#t`ЪjhBffb5 #ϫz7z%06R*tZgn9ǾkyP7JlKtFmW%cm<@k@{M\\5M݄J- u9ۂ(k:gJd_+ՔQglL:Кs Æ}Bٕ4,+1QͷwQGcB!ek ivU- @ҙ\l}+9 TCջڛ DPX<¿.^1i >z Үlf L t_r^<30fL{eBE7Mjaod\ qL{0NHM Ry@jH~1\3cI ˆ5\}-P4٨j23jhc63DU㝓&4^'a#!-?:'5(0 k yjN S(HW3z̖䠮xOe 3a՝ꚝ8Zi S  R>?ټ9r$7p2n2(aFxb-KDPAx(\|$#ںuՕ>3B]{ Ǿ>p BifBׄ\latN({XL&jnjFr ]Ci%ܖnakC!T>&a3H65!1*}m؅bA=dSc|P#Hy-gwlͷ9y&Uaj ꠭/⃴8%u M0k&_'@hwm닉e[_$ajkv,_o=<`krcHLTbN Jzq Kf4KNmc_Y7l8УKnH;/{MG!KJ];|kǫrqedQh/f=jc0V 8)yőP*5)H^\YHz2l~8^2m;ߚ@j-e[Ki[foR8 idf =ƹx0 ;ϚUe?,Zhf2No> HuClH&Uw_\ ~ײ|r"2Jȓu%OA&ΙY ݎK۝w;$vB \b3$7 ܼ"5^{fdEBPIr6wC2$g\W-(|}QS|Z15[4npaܳ t-vTn@b{:ъOv@+w\?z=.6I/O'~ꃽ5ٻϧ;.]koF+Dl E^ȦEb?oW3YיA\52Te~Kƕ4o\k Lͺuul!M~5}^9sAcoY {Őyq6Z9EsNW1uxy}hݟ-pw .\ *rȵ&Xo(YN;#N;~=+NG=!^p6d R+G!ʩ6;GT뢻NW߷0cp4}*qecsH]CWa󹹘ND{U,M5A4VР#T:)G E-3ҩȵv/p:UpQn߳KΩѤNAz%#w 40cq"^, 1Ry0\@ȶfF"a#(lih49D'Š`{e, ̰6R&RyLڡ .Mt!XFs1v@{&CD7)@%KI h-NId rg%0PaeZupi94@F"iH?6FÓJ)nZ^dSQK3(㞿{V<8"'5p2k(JN3Be Vh;ӎ'S=KUOy9#ҁ6DP'[)973d DI(! ~>t͓q4OEAT qK=!Lx̂T5(j [A!Pੳ/W`5;oMg/֫b̮? 
V;]^l /ēW™hո 5x:D ,'ۂE1&?pͤ4~78] ܹu0x?5 L&3A8"dIiRSL7rz:gS.-Y2fэ>Ds_P%K&6f5K?YnjJ~1SlISMS[6aeS:`f>9D0!ђe~,g,(_äSW,U;__S ٳj,|>T2?,4ZPS^&e},#>{ qTpS)a yl\'Lm?/I 9< |3z/_65nQ?~MI5ܤ>M.߽x>IN.9ه${ſyٴa.^CqteGZ?JT6~:)GP!W4^&λeZ_d:)Fy쯤8+-]_qg0FPwEVd^$JV7m)]KƦ8}7^@q6_ֺ~=dO=6U^|~r+: y*TvWSa-'ʋKBUWQ;XAkX_I ocB5tT^9y5}XYy3*I7Ps p³΅.+za#W($4rZ{o{" s S)T0e.Y pCl٪/&I|{6I-̵`\( QYV+Pp# BE,t D`Ȩb`kEIW1!Z9޾s{۹mhzķt)~Rwab ! ]I_cOٓì0o_quf?/ {ϳL6_ ܸ]tx5ipg9dp&W۪17f}äv,G-x} Pq*%LRH4+zקD+׬6oꮜiuD紐e+/ޠU[lTo&A1#( {m62 D` {s9hx|U?m޾zZ(ڜ-rm|0;˜N}pAN1X>qeipP nLQ>w▱?'["WmDq]ȟbTuG9M+lMBrLpJNܭf]^AƱ6$Qs-9˝J&Y) rmF78Ŝ9IQ^s f ?}dz;.:0T \(G}3d@Mٓ/*aJ<BS0/(Vm2}؎FuirƬaj9Qru#-r1(B)f%Q:KXx#yhWϓh=6XlQV)L&EüLQ)as-є`N2u' mE뙢SBu$|3aw)5;mV;Z[:@o$l rJM9tϙT*&D#[Tj]A~ W6wq o EJeK*DFp I!Ã#QL)ڤSdtv="a &LX,J; 3<< Gg-c֪&l7w0դy ZiɁGe":$k:K,:M@oIe"&V ODQ߈ N;Xe Ȉ Z# yR uxАkvm_Oy $̃@4:X,fEwZI%HD0 AP$@2q;+܋U cTpֵ]Nh09QF0,y$[Fy kSQ=ҒPM\7kiZ+, jÂ)&Y(! "$ B wԮ mmXowb쬒0p!t\r0V+M7!^TmivwTRMJwg-"ɧb6NIdelb$I!R$ ȸQ{(f&ol5o=U|Q(b0aFVV@Ҧ "aHAS/ (y[ T.p.:O !’wMu%tHWOˠa0zy74)pio~.g=5pr>r5'B Āt(tJEͼ4.&i8qk),!a|-+Oٕ6Ui?m_oޔ/_LN/قPESs7o L/"<5sa!$giI/.|Y1dy1/-f~|)Z`X4oZɇeŬyoq /-ե֍\V겶JdH[T0􈛂]cvΆ3z 2儮TmYo0}?>p7? wׯk/ /!TepQ`o`뤌 *1eK \uE :G,1IթN2.}tsfKbƨ}ak0;jY |8DIrBai'>˱џ=6?40_[ bZ+lh?)^_[8mU9'}]m f.|uaxx}we&}irTuv~z5̼1W&ߟ{h,WG}lT;ቘRQQ^0%TA:"o1ڦ($ Հw@<SC#w`&@TrL4kt)cMl$|Hw6{. }>׏%=t2 i%z2%rD($]T I2T}@2[{J]E+ep4Id%ݚ}r%5[ypvT~{X'cٗnvUwUTm!qzeل}NBw| ]ïru3bh6 4_*]}56YTf$ Ģ+1!v&0jꔰu8E/q:[x j3[5ΘхR)`kvޑpJX2s,S,i,<Qn2@vvp9 xw"Togolviv8}ڊ&Y%F9LV.{A%JҵDИr>Rr+b1ʦdia WO KR)]|\Uػph^.YǶQ:Gm`oxizYIVN"עm 5Ag a@UƠx}]au>Ĝi YVMZV6Gn;ULb x,T-0KDfFD9")"Nʍ)В,e,E( ֘\ۥԖ#tAGM׈H=ghJZřTRF`^˘TACn n#bCˣNʹil}觸8[kUBj2IV7AUsVEjc 9.nֱm< a. 
3Ix]n>/sXU~ ȝQ, 7u?>NW!t~\˙qqww}DQj:d9];8;>1ZHz\iePLbT-R)8$!ŊC_]-GLu*e|XW"QBM }X75*.SS 甠/}k7߾lb\3~ҙ^vz\"ĀW MFNwwB F0)6Gr9I`XA^vm9!0+JI'9!2&P :N:ח0J[FelhTAE HU-̈ Τ3Ep!;n DP9 kC:RDI\0Zpz1aA=VoÅUӰJ}?̓?4px-) c^}z'aua:/aq|tr:2he]8?s  9Z p錯~xu\~\ox6WWoֿϜ50%ƕ6/=8{x,/7x{/ܰeJ_'鋳\K;r'Ək-;FMlI/N[R3VJ\ۑiP2* QL"gX"+MµU8$\Rr8/.:ʎA2.+DK)Fpu5tgVQ.֘A2 M *R Ck\9ʭjO"TP MEb2@\ďcOEm+$`NLi?ue{K܀΅&n^ۙViu"Rbpr@U=fu#S#Tcb-0-^2iUUGu%Ź@S0:,ZUZ";v+՝3>-e&yStBSm)Q%P-@)htbIZ)L\IJ3Q9}9*$KR'ZW"@ bvcoznVө 'm(b+J^bC2:V#D'ᜥvN*jk}#Y{;~s%&Cfј䌯zm+A4lLX)+I Ỹ/5QJqZeCn77g2AwQ3yۦgGsFIv.0/UTZc(OυL!PU6ɏK"dJ! u^OA MqVRe$ uΉxŖ YB.i)LL&5>jhHuުlvwdey턴 ic |s^ v]Tj،`U)832@]g[<]LjUwªkv6k8jaʧQR R=};Sx]?m?g^%Au`JZvAv#*N#ǤSEnx ].h뙭t$<}[(#޵'_0bUj5±ȂU&ሖ5o.n4o(lf6O 7x;.)YQʑB{[*d\FEK.ƚ8ٗD9ba7G'n{nAo{5hqJ(I"(-N[_:~ їQv#\ $hrc޷/֗"H +?/T]Рl)Z*VkT% J$*=9XMZĘH)2&\jsoͺnù}jG6]ڑ%V*R+evrqK2ڳjKhfĔ36l (gI'bxg%1VUK% =H *smH>p_ èjXIi1d!` @Mu +2>m;Yϖ~3pPhʦ`CgeUД|-'\JZB9)zb@}8dSU]4!QlXmOAU E"B-"J HV7OgdzҤ)oNT^AJP%3-¤Ơ${*tҭGn]8|x_6v}L|7ON-JIT#.lނʬ>Er 1%.l4y[ۇ>{|Ld[ηf&{K6dhPboRXg$6Z7Q"S*ܣ{1f![zS}YlO|wm޺CUMfW) hג")'T F0D O;Ϧh+^R%(C"N^]J\{)\ԖM6琳EԝCAy@b\Ea)y[,hJ"~陸&΁ku37a;H/l]@l<~{wbR"w&8((Y<96&FLjD•hSTcL#Yҹ&qo'}+ćn;DI) lQfJf;M˂-c2(8m%kfo \ޓ9^1z\b&{΍,Upk .R(HZ!4`TXXzgoY;{Zޱ1r9n)dw\q,?Sdi'5q6ٌd! H'ibdC1N0N?h)]bi?s_T<@ܬl4UɊfa'#X`Qrs)g9"J~"nnc"n{i-0>Nr#ɻQo&}޷RVi2*9s\%83ڪ,)Ɍ`@y(V \#ċ%.?t4G`#Fy\`\qck/K:CMٺ*Du"(;*SJOFJ|V uEn#*'=Co7JA{DE"uH5F㷵5؁WjuKT´/Ѩ|?OGӯy.L$EHhQYǐŻ`nG'yt&GɐWB<4#<,z<~P!81Lp?7ƺƵ ЮWmx p yC4@s2Lm%MkV~M>^Y^TyfQEKs6hqf=`utL跃%5 tIj>IQU{oM^4 ):|4kaǺ2g}/!NfA;aC j<mھ 6o~7t~57+l "" UbbC i]wQ |Ls~˜zvz&? ׋.սӷ'FiͷZwۓ/mZ=z׎Z>tcD߿i|s1_7a$? 
뺜98ջՋ?,cW\>zՆ Y~:ϳR/!Xյ!äxհ5B7t|5ٸUj jy}/Y !Ύ#-vs1л]MTTh{?Pfk)tCHw*i}7Oߗ&_ړ~m^dlH]Y vǹlTY ҥܸTj >Hme:(JVL:x+-ԤQp}`e;NlL%T!;9 unm?xw@W+&v\8f<#Sȓ:p!$KZo]Jj$㚤~glMI\ OB"'SGŔW%t V&+#RQ) hzT)4ƝRL|TK#/|Xξ(M'b[b 4Tjcj/v;vž zLi#e\|/O?VѓEO^_UAWpb饨=pyLs'>zT?Q~(tsMhI C"-+&t^}EnUZA3'WkcyrM 䪠54XTQ*"bJk:I 厷9v֋R0Dbh9b_Z],'Y`њAxU\6zjP`;CtTݴ"Ⱦَ?k~r`mzvf#k}ٚs5)v%#Ȕ)hD&ze$dt1.pC9[ơ篮m~VnI;oG38gBO?Vg˔YTf2 /H-'duN,k6Xbb ٠&d8ԉ8T), xGbrp`Q4%@CY8Nat Ne LKa;&AwL1h0`SN !SX "!YC3uR_7E&Gy稈5Faxf#WC7v aHee|up`gZ9 @YU0~=Xgp"L+9dު,<ՌQ.`pq"*Xk*fZ 4}!PSZ!u]Wl-sХVݽ.(byd)LbvѳV+D֬rJ"j(B()6Ja-)x^A2V* S=VYQ8P{2[,xҁJKzv dA2X?`y=CWawe%,pjM&",0Nw +<>ڽ ؝׺Hܕ@: ^{@# vm^wK zk ! ./[7bTFeq ((BmLBPCXW/#-y"Z%*wLP 8@tҫ Tȑ&"n6uQ8?[oǒǿ]cb<,r<HKAW[Dꈒ{R$%M$*pb9S3S]uhdr[ҨyA1a.H딁Ycv65H@8KuF|d@!RP;H-2jXTuaE^Y`(#܁?Ubl,'R% fcI;aa,o=],gi&0QIPЄ7 BYU[fq/zTV֭BZz׵-Ia.y͸t,2 cm`HuKm0Zk1;fdsC KJ$5@ pY8 ^_7^} Glgkv>0:KGph7pܠP (j=Pȍh>boqݗӷޔn_&٫nA9fyt~OW߽_yGj>8pp%vŊںkSdlQxrjXD4R &6fдp~_Cgƺ΅cn{v#OO_wbvNEXQ5(AЂa0W|l7yst|;8 |? w~ vЊ/Onnf|`-mAVo.x_>ek&/)t~9ݷ@6():1+Q7wgwBN?W,~w8ᜅsniYӊs:m\=[e,>~aS7_ 55 jOf^6 3i#l *Ce\@Vu9(Bviv/nVmv(t mu~9.9_ EYc߽ɓس8RicPbbuPF!*ˇ| y*fC4̟\2@=]Sm~]6UjeKUd hVeS $hiJI6 my%,m׾,9kDDZ> h{Hzh<4ڧV^>~zMyZ'[n[.[ΗkŹϠW ΏN]cU|­R LACհU}-sn~0hǵ|~em4 J1=ۆ2m9r9 F O ݈'nF|[I+mp|[苗l˵ml ބr"kvTJ̹_) H4?^*aVy|qQ9$٧?ϟOMؿ&MIA68hib6=)Whl^PCդjRGGt--M>C9mtmqܯl{Sں+.'cZûWձ~^__z[{HPڻnqq}ȋ>\ЦkZUٖ2dh齳mXsKf9pZm|(˫qPDrg\.^]5ۛ"^sq2d  C~f9J6Ъ3+k.%G"^VR){od[mAQ*I%VhƊT ؔc`7svpg>$sQ]hO9նu: Ⱦ%8h<ն/ދ*O9YE-2׷þM=;{sC-Ǿ[r>DTme/25*yXx[(i^wkuS.sVlNgW<}B]P m,'\U4.8ι>/`_/NDe 4,dJDmb*} A[Z`2F!E X\#}QԶ?H"A̹?b$yǾQ[uڊ6)ػ<m%aPisy,=}d3Y" Z >jcCNs ʐVMʂR’Ed Ab[t̹?EDnoDԝ#Hʭqo&'_](^̫DI nb.NF%=Ohn~UWc9 lhq&昜*hPI ee\6sDfK(xqq6,!uv} Ew&+6ZkR^,>H6~.ku-YP .< LjyǾvvx! 
m|c7E E(I;Lُ4kyُc*vzRp,N:w4%JPVA\,35%J,;>|GOi;ƛRL8r edEk#e F*siiNb ܵ"/Q3cUDkmX84ApWiZ4) 0i%UEHI%[Xv M.wg3 BQCE KI$2Rtp0bM@IJ# s /[MSH|".?'y.^q?M- T2=@$Nɖ,a >D($ɞc;N3Cӽd{(?&sP)m.ϗ[$!ɗG@ryHvue eSD:C9:(‰w1NzW*78qkB˥CRwSr9 HT!* q"c?LK q8φ$(>47 Wr"@J :nRd]q4QӮ_W%/ J+B٤x2ɛypsbt=<-ճ3Rx:.CJ:2B9+o-iY[:Y Y +_,}J#F1~bf;s l[;U}t,0`Rէȯ%K.l01_cP)'J_'uי`9|zͫoߤ?8ݧ/| -6}|aۜ@qk҄ 3IQN) icN&zG-zT)..W>Tzɤ6M&udR+#(Mzevh/\ٶH랐BҪN9 s,&k/9HgGH(򹖜N)bxHx Z8h(Uª=SmUpԊTtNpԇAV/vWddJ&|pWS]:#0άBwg__H_';V%kx|s{NϿ?7K=b>HZ# ])7t -/(,+!Vqd(7R4 }<*r&^rjU~M&DA٫{x::Ez]~ndoAԝLIgtڊΖnwej$ڙ̮ԬI^GA$+$x;wut:s8ayPy0ۨB{ĪFx8`:<3tnP?.gV% Q[d<)V2""&ZH0<)c"763YѶYra.M{(v/4VR& 1T&pAjnDh:vF&rI󖯠_[#-k'h жf{2+8LYTpwb*KyM )c A%طQ ޗK}KZط6xۈT,j(\8KgR2D2*!JN4fH;u TrxaKmc}J4p0)&8 nPc4# p!h8T`$uhυV-ڸB(@+b* @1Z~UlFOkך(×SÐ6,wdk=ӈ&(Ob8FONAVBf8;ejyPӪ\Ndg'ٯ/eutus^uF u"Nʞ!ªlܜm[Ij۝L=ds' J+$'Ѱ~4yC@EƒzԅRxtBQbgq)kxafĒ>&%U~"񴗕iw"SarE!tEhLxݫa$,\/\Q .Z_RÉmo5b{+ZOxؿVFؘr+G!ʩ6;GTYʽgr |_<(:;p=9[OZ6B;ѤL}g{U8X"D ֠*嵧AG.tRZ*fSk ^B:#:y-Xhm@W̨HǤDH0XJB&2Mi9x)z|< qd71Eh/p/)J ,Idjڸf98sVOrşl@$8VYr``AiiP#Q=nx˃ >`z3V8<l y 4Qʇ@4z&rS&MokurL\]AF>ke}iчKP_OgF 3/FK4rƠH8 a#!Ÿt`h WB m#k"7$ ]p6%/T%Cs1fMEq=!"} TDpC,s$"|ٍFB~S . &O$g0URv@Iܾ"xW \ѩي6PMD\W{6/MtL~%(cq02/ϣAiUa*"\TO5eϫųRIgw)xÓ!jI;`eݬgg~]m;^=Uy9?}ɹW?T:e U&əG$7\JS zn8{˄޽.NO. w}?] 
ջOOo;P>~IK3qu2{s2 Iϓ;Oc"ɰ{;ߛ ]i/{Kz9ZG?a*De'3jqpQM@p޿, Lɪp}ɗ}nj!6g;1+24{IԻD^})V]>YEJ/)×@O1nQ/@9jN Sqk?{WH\ O^xG 4,DUF r4JUP`bBm MY5y"mf̂;;LJRD!Ch!2 f#Cj H6'ɂԅWG.M VdE҃#eP*CmRֵKUc;k&Άvշ{(j =VhJPLI xHQgIR" ]֑`-Mӏ3G~EPwaVi%W%0 Jr%%KkAxuvH/Hb[DV^dol"fǞE)ٻHb)ݰq^385;@xeB=K3GQR@cpu)'$],nGeƒDOZȶƇotP]HQ˘j}bI@ƷdȚɄ{PD3G;X.^\~=h![ZǬb3{7|.1B,{ z QN C3x+a-nm?79WJPO21V{ A5[Ҿ=IV*YI% ^\UR`m .-\!gk sxO^b d=> D?KUDJ%PkhdAcXLA-\NT4#j\'hzf[<9 M ٧$ O'DhnASbkaauqgy$8{,jyP"D]Qts)8FЏnɹ*{+Favf𗵿Xd`.U͈W'=|1PG_/W>\7YfeYNk<|M3{H2q:_fQCgez,ŝ75% #F?tZqO/1Wh/۬+ay9{ͮaa1o,GE6[_9ߥ>4=2O^a}=W?o~u}Qo7ۮq_]Y ›=W Yz9nUڴJBD_2|Sw\ڇvZ wk2fv͠/O'~r1D_3!|aJoF2Aynm>E VOyOd^a!u8KB2e˰I}'An_o%l3p iQۛW#v=<[GĻTe4ogW~v}rz>7ez|>_y4]}I)lA"fow=Mz962:Wiఈ9uvRju()*{VӛyKy'?&snz&كnLB!=h+rN@46Hu?k' o M=l~GƗge75jPUӶP^LHUZ.QA0yӥIޯO.|XGuө!`w>ؓhP_S2xd%=6ET~,v ^<]?SG3ul$%z88hqBb/"/̰4t%w8Loϯ^Ǭ6W~ WPp BRQSW EvuYjA* ђ)x; a{Ѐ55OGwGТY-t{/ *^;OP}٧<A̶ۘp 0o3 155|li.VIz +Er6!m3jqM*?:^.W2^$4tɘ؁! EN8f%T+5ov%u[:+1E*@],hO|\cSLI*#آNÙ 2K? KL h +>BE(`,x%t& k5_Tf:+p?_p`P@tR\+5ۺAgaU_ʧD/'{32Gp\9xHBۮE,3*0𺨒 C&Qe (3J?SAzF}wm -D'W٠ɸ?`Xw)BO3=*6*=ghծk ^yW 5^MH }! )岗Rz-MZ0ֻR! IŮ'Gq r@p9L PIYaffl uc__W_xKuE+=7l6:zC.>N~puu) kkzeE"JhnK@zme,>L}s%{!)VmjB2Nxd6ЖηJ.<L;k`ww "*|hR #!$59b bc7 JN b &uf^RѪ| GR1ID&م6g?I :Cc[h{D3x#>2k::1b! 
@J jTeN#US^ QI h:؄e~Ef+>Ff NB{fl>Tɼ#!u6ӒmmqmԲ9DeAtkhQy]?E_d>fڱ?t-Uدkw_1 = 7)+M~)t˜hښ%%7w9sVc&H [bD)︧xGxڢ)"e L#IO@R)ZLd][]$}=ٻ6dW(AH}ioY`s6Y!TK(R!)ރS=9%,ʞətOO]*p!q]<w!'S"C!"$-s@0d%f[{ ^w!r6Y0FQ=snލ^o/Kg« ե/kGdύՉ4 cm|&Q=#K`unj {d_F\ЉmdΆyBbZ10 %R$KW$ :ETV(cGBDden[{R* rZHmqY:MYwvUQ R pu"M4%,[^m+r:[45M 5SeaϖEr{!ڳ@]9H!9rӁU5 N1JH>Xdr u@q5*`@O{>6U.p^;~)bϭJH􅈧z`TP(8{Cm\wY>JsFHߋ(\.dp5Xl ; $b&-nRY UH*c&H'-ޙy Ћ^k .WFY^Nh.Ebt>.q*uQb5+UP鄨*\ ibFTWK+SUkj'H*/r?rQQ饏q=LTLsɤ?6̨mݓ N*yԬGϔ\{R0P2Lp%hI1P<)^錌!]8j8#aX*]ݪ= YС)RY>d5: 9.}cHD#jm^&+8T 4+mxw\ۣ)͜GW+*ɛ(H3-旗}mT9lq:8ŏ\wQNc+خ%HHF?^C>:}}G4;$>-E[3ZYHƍ,Xuai\F{I='abHblN(yYEU+7+m4Š93KH 9loba(nY}\bOeB2 Yl"Ykc!pa2w]6>"xG)I ~wkxMɓ~S{L9fArxbemUOӹ> P,kkAWg6n(ǠlߗPis&|'`vV]'V:iyl5|݂zG<uj5Rj+H]W5M尪n},G۫GQK:jJhxsh~05X^S{9vt7Öz7{Xxb[Y׷wCJDz-96{pbG{L d)mB[.{.AOz2P;hB6rŴ^xGDfMjɻڗ VW)uFx<-I0#E}2N!s]@ NM` 0!1YHý`pdYvHS>;'o\c,'DkD,h%U;^.Gjδ {&h\`7#h%&ˣ`&h?B8jd+yQ GOϪ]\bq U3կU=ZmOӪaG߾ɬzUq&YՓVi\M/Gxlcj_lu'X-]~U fBpjGk/%}l↋k  ?ݎn>R++ W~Jll.rg )Y,vpNmO jy:%<TI>PH6-pϫ9^6w^1\N_#z|5ݯ|j07bqruݗ?_=\ʝT (X86b6神2n+['wH6b^xC̢2*BۈY 嘋dn<}9fEbp;ӟe-Y?]cm[lRHX ڳoH$AR(*q#4T,o)0:NQDzeS" &Csc )OR6EmG+=~\v9,s(ΊZȟ:7;#9=u6Jګpʟ*˔͠9őLN0o$l bb9i QCOd_:B)CF@Fx3+26ϚBnnXtDulk0΁hM>;gU2:|^)+gMێf 0J8{u>FF<4(ec{c'gk_pyL~vf=t109j;Ѭ,j}횅: 1N&Mم~VE'g^ƬhK10&(l$Be!gͷ J)A9޼a0=ܟhhy(E Yd%>fB.Az_C&5 bYu *Xˣ3[ XAFǔeL`3Y*oڨbc;U`~*4 1Il" CgAxE)V"j͘*Fzey eg4bLX*a2kNIbbQ*"h wHy`ƽn}4L@3I %/wWzO=Sbh3~$74Y- 5xkSs7V\{iOs,=?ˮ{-bӵl윌 Fײl{LHƻg8kYl.OMJDQ^b !" GD?C<=)As|d,/j $(;胰f%x:$'NXZCOy0Lt=S.c8!y]uwg/e:1M}/MRe,rH3 I"P1gYkLL<됓z$&OSoz JBv!K͕&<1j)k0h51!S@P)O%} PKHAVKIw`Rddº.Ywv tTXYٲmdvJH>ݯ)ݮCtݷ)#:[/Fo[g.A&x2S"IJd)Uz t:XɃ"&X13RozZMOoy>;ȣ{̉|JfRQl jBd}6<<8,r1כ 3[D"x$׳L%# 1VdX8rpw{yQТEgw]ԃ@Qב A:^ˆ^\-D=s}f2j"L"( S2Hi6q溪5}(Ld7Z}L?NMD .f?>&ϹW/]`MGGtiF7iBPgW'ށTşbRNt6vmo귟^M^Q? e`W2W2HjC~No0K%ZKQjff *Ƶ_hRK@8/ d5v} nb-bءwSic,P?4t- xu |s囟IjڔS.>T`]>0 `Ƒ\jwh6y}iq]#jmjZmgohFo ݄1Y ՌMTUw㾹o^Xwɼ&ɒb_"Ҍ\JY|= D!{k, &"O2)1jp&ye4zl" +ҩAfmgtDDVsk#ۂ{t)%2A r? 
}7T[TqHJ?+^MpԃE'SAQ]ص TJD_NfIZYINGcùyMe![Pu c#sz5|w{Q;0f gE3,0WA_>TU1A.z7wһP\N4ta=i O ]zɤêMqKF&⿛H卅2Oj=VXqJ <Ϋ{<40妓̪c;C x-b@=OrN"`?7G޺&]G3gy UdݸC[g uvVp-KzvzI[Dn$m#+G&5 8ڔߦu 孯;[vM{E.2ڞZnpvPwyPȻm,-F Y [Hsl?mVL]QOĺ;ѯ8ɫimy&}6*4QhNHH)%BZRSg(YNNwZ\AKz(;q{xJ Q.Kc#(pR+}.wr9!#)%U[ۈ6]vfWM3L'MQ]l'+v{D5`fAG.tRʮiNES$xSX;y-8U{&PfTetۨ KIwP!"]WY߀'= j+{<>̕\WdϲKRYc>@sϲ?H$ \q,7]>^0U'bÜg؋VYJAiiP. =c&kXrߏUR- .=W#S_B~ E{Z<h/8* GqTǿ}@!h69gh>!ᓯ/F|wfG.?wVe{G12V[~+oe %7oe9[~+oe 2V[~+oe ᷲe 2V[~ A.nf2V[~+oe 2V[~+oe wic9o坸 2V[~+oe 2V,,Tsr d?;#xTf0HA7pZ mt>d6_)_<̤3Ͻ$^I̧D@^{-XYaa(C o~^ݭ|{v9uӠ~Mv=^71gʦsֽ^燳 HaHpupji1(f01x:ڪ”#RFhMt܁jM< S3ačw=O+3zSُm +p|q7snՋ~W=9|?X|}}Ѥ:]?_M4hrɳTj/2\pߧWo{Wm:\I(GӼ߰D~No0K%ZKQjff *Ƶ_hRK@8/ d5v}'0\9XQq׻ЇDr.+2Lc]׳uf)B zPe7XnPEB3lu^o&jK?ks݌y3\J`jpkblX1M%W,$'@\^hdd;?=.3kv/bzFW9[ uN?KKUAxnwka3kޓZ S6o[C^bvq3hf7d1M?W=0M7 L7}.-8&7[w _2ƼDPIjZ45pVΠMxflҀ6ݾy}DD[`ߣfZK ד .9~#Ų{a#W"{kEBU0e.qC!ݜV*Վ6:=a„,v{(8iC"akg"0A|dT1g:Qz](XgWr?TƒU_1x-G)av܅nP_Cag jmt ^z]0 ė(!QE#!a=rZܬvx{7ky}Y|=iY՜xj ~l(s3d[\ Ԝ+)8SkKa͓N=2LwQGo3rx7Q::O"peQ[`,)3)Xc Ȅ1FEkm#GJcqg0W! 
63AGF<~òbَL["Y_*&1IX'~'\Pwt=,ǂUu6/&[f%W[mjZ B#&^X6[voې],yFH D#BqV 2/&lY(n4 ikx%ds8YcЀO184I1[ȒsP`CeT[dPH{()]ɺY5)n~_|>j [lW@эƱGG4W{CG&ٕWM>j ȯ ez)|9#$:zp$hp .I(x:eAI!(LD( T1L ) T7 8)5خ5s-Wy(\<4EH2a 9UN$SZ"*Ӝso9*[8 ӣ o]?CM4l Y ieps ^rHtu&]ҌUxQ[ҢІr Zb;{/j∾}զ{Q[fe$de e f<3X*D\#JR(IJR \g@E"/)Y5aanVƈȹ#n/i}:Rsﲼ}d{<;|<{:Ep|7j~ >zH$C`>HjN9SRѭUCi=@A77ӈ݉T2(.Q*%= g3Z!7YJrP>R^"~?jecˇ}>̪0p rC"9X+!VNGᤊ/ZZYklIn-Q)ߴw_8+^?k)EZqqyd!74 cm|&F[( ?#9H7qt B2koOFl*IWK)E%d\7U!Je `[BR* r ^ Xʘ9t,WOwYd@^8u g-y{aW7Qח[7zl˴\:<#zmM 5SőBw7%s^%; <: 2G_urv62v򎜰DҮQdd-:PYù!@OZ"^^11 Ee[exDa GD_"KeḲ7&I>>wY>JsFH߳)sC[v% jt:.*wt9(ȾX~[?#N3 7ǖMxV]S9aW]gd+ɄdRO7}햩6lvj]AtN9<V][ !Vͫ.^jp8z]y+[uuK'ۚۛjk3KUѸQybǣrmN:7/(se]겓ZWaMe2RV>?'禬?ㇼcR׼TO6`41}W?}?}υ}^%YF2 H$N_4߯M4lZZEb7۽u l\R:o^ҏi,;mPC^ )jsɢLk絖xŶpDl7ە`/ NXJ8z&1WIpE`]T,&gl,\.Y %YK;{?ʥ>.:yfKƖ넒g#R p Z0f1pM%2fT٫̖xgywɧ*eXՇ?5Qp{78Ik'n㴽)r5Dhga~EE/d:_ra:؁l3g~ҷEhтuyRy nʖϭ=f~xif/h0z3u4活MMFӹL<Wo~Y=o 6G&]Ud|vؼ&L> oL4xnwK|~Z9ʵG>j=`mž*#Xyr 6l(m(it,eoc9ViAjٵ k 2 -Ch wFͭ\YN~HXqsE"hA|Iʚ6 ZoБp'c7>G[ $ϗw4~%p[ɝvIgn LYKQ[˒2:W[+^Н n1lIoa{8PpC_}7i-̄?*@;&1˼eQ@֦,[.*-/󭦡{QjYdYA<&紴,h3ABcQ+T@ &֭3d=VYb eHч8rj:%g2'}IT&]y 9 VVeS^;]7Iu[Fq`%/z @cd9:'2[-q>$ *[I@ЋXʺ^I%JE"fAţI`K'm uR ޜKDn5>)"2*{D};;Mλ3[|jݘ7'g{X7_wÛ9.Y{K>/?݌Z8[,E3gH*!'OA9`KTפdu!&C`e[ qAV]c97J.VϹII֌ȹY3*ta5UʺP^uኢҷYA[ovˀ7IO㢒Ճ?p05يDp^Yɰ$hTVt%ɡ*jB%L9y6ʁ'}$ AHQԦԆZ du%9bfځ'ȹYc(qǮZ*kmv`xiE9$3œbs2PKβh g$)VՇYic(C&dȡxɯ Y#,9̇Ubj4WևȹYvN2|QǮQUֈ׈FsCtgt*.[N:σKA/ YΘ&5Z/k  EτԨhc$OZp%FF-*#Mu 8mµOkոdW+E^/^]%L`;ЊA4ޕMeJ26 kBSЋqǮTևf7}x*luwMϧ3R!EΔ̕gDOèBƨElɭv-j 9F0`];1KRj1[ř%@nddZI KAd .q[ yH\EE3B]HIՄȬSDK2еȹ@/>5Xj>Z626ݾJ8?$3݆5'aFɞQkW{=z `GyALW *@(WlJEQMnUY Bh}5ʕJ!EQؼGgvv(J*dgR 8&9.tJ lc aB&+U7yvdٕh$r>io-8 Z30 ZDqg5r*A9&D]"oFHB֚`Q[Y ݋g =W6)+ς7-~:#308b66Kߛv*uoMMzG|LOǓYL*:If:D oVLi^oBq8,58:=ݟoZ;{YБ~ ?^;4/uv<@[Ѱ%b#KNi۶|`~md)V9]ڌ):bRIXҐC=}8h[]ɛi 1K*./ ywX!~u:ufuzq÷?X9 J qò3;=1HZ@Pp[50ίdM]E&xB$oRf%xY =?aIŰC~VI6y%'5qnGq\Aq ۭqR*rbD"+!(bjMbSʭjO}VhB 9!*T8bCJ D H.hH-m omȉm-V{ϴgXMf.tj,-)GD<LCoXG3"M( o]h4A6$4o)T1at3}yȂ3)ir0Z!@t2M" |{{dln|kRՠht;tu3f}}lؖ׏7{LP%]%l 
[binary content — var/home/core/zuul-output/logs/kubelet.log.gz: gzip-compressed kubelet log inside a tar archive; raw compressed bytes are not representable as text]
1r98DC%Ҁ ^e%8ɤw%BcsRJaDqZ D!k}?SJ̍IB|JQ BQB¾jMJ*w&o}MfWU9P\t%"۷%();]?)lMPPUːQsD$Xxf1^RG0~LGHaqf'${k @e0aZz aRh̜ΰ@[c"* ^ FT5N^4} B`D.=e@R8/;xUͮBo*rFxw_k"5jԭ" vq,@T 6z;33>nM7=v>6gzOC_=B' ׋cӊ/EOt_2 T t|4`d?TL[ukz !x#vX3Rj:-j+R?+:8z[RH o.:2j !X ֻ}Ylo]9,z[ KTM-R{޸-oPm`vA1jhat(0XD WD* c7ō`wh+ צ*=)6b, DǓodffKC\|{jWAߺ& /? ZNJgB R5"MJSL0^NeLL7^d2;&cuM<; 1gwߠ壁eٙݸ4zo_kLxp&<u=d^.>M3ޚϞf@t݅kP&ofw}{0+P73x[F/?!tl@w|dzow*\/[b=T(4γ_<'73/Y%:CNZiw3(_q@/\pK~?K~y!߽ s9Zn+ZB4 ?D4?|Y ϻZL6w_?~^1Ds*O5%XZ"mnOqC}) SQꞡ(sloUjA?:|CJ" :kȇ˔bN9c( K/j%W[MR 4Yq6frEoo}qy]xog 33 D$ЕWJ$D")CTp76IF\fIQ :İ1A`glNa>emIO`_϶mˆ 3PU *V@ٺX H ;2ͰXt.]م,(T"bG KIR-& ZML$F$*d

l^tk"{w>L]Q^hǮJK.#}7:{W͞a4=lp[NcT!N`#b(-v3)9mTn~tb &A"&mt3HQv5袮U4a_mo=-.o+26 ƊeI,C~1ˬok&LI;ĩT^0AHQݎ.$kq8o3._+]Ir,u=YSTaMEQTgm +4*R n;ֱX$L%nuPؓ ̯/&)) d#yv)r,TB!"[8+@ NzC5xCe@ˆuB@xprB[(nP

CIU54 Sʗ`.)o.X(T\o-4vx{ {L~-,K:1Lɣ1B[{9VعG;˗**kS=X͏e9I-ە77讱>'#_X¥YFKqpeGE%蟝^p{U t)p;ތC vwWW.yy={?AegfpOF"P?i塓2j-P)  V = 끊AXhϜpZ0#%(_rz,9ݤ&{^NfNFEֹ׌Fk Nݼ;V-[aoE|>? fу2 `.[,>.#T4#lGTeujq V^TStӈY}8rql~~98PQr@)g#ppj'$J18A&8I2qV4JtQ9F8$UN-N;fւFioԾ\WQ!7~Weg?2V+H\Ea "2Dg >ۆ<_i_̇$Iwe@bp<ﳵw*QAlYB*u+9bQ4r u/PH4Z(֛õ$VY-iaZZ ZZ@<=+I&6)nkHie*w\j[7&XW#K:7HCyRY 09p?XYI- xQc{1aG+DR: 9X[83؂Μ@mLp&$V#%+P,Fb.z&d󸙒΢y6v&wCoaaa&c;{::?f?|R7 }ytG4YE@$ v卥y:?+ 440;oc/K\kdJӗs];wG#hByh SFBq)`mj7-d[,Ne[}N!q2MhQV|"%ST4>n!M-ED'v2ڭcJɴ[4E[hL1 0B"yP0FQwmcJNݢ -U !߸FT{ԍMFhľ[$ mjDQj߭z)gw%[hLIMU#byPDtb(B`gM';k0Lv]?{pPr *n뚙Kg?HY]bfbLr*w6|C46Yˇ !<ݻI>fɻdz-jf?Ng?o߇i%oe4s.d}޺+>h` mAz|Ap@u-0[(L4VEG >M`۲kʆkHE=՝:gNY/0A^<_M)0 YĥqA2N}|Q׍8@Q{ %榁#$lu&U)HcljBDd`SA(M:BZ-12!TkSvҁ}F{6ܔZnmLi UN$KpǗc:t -2 -D_ROA4Z$I) B -qv h} Ǘ0 3e~:+oE|)*_/Lk6 E0L~)FRf藯>gSyi}#1\hCQ( aBBjʹ&&Lb3XFb 4y6jPA:钀)~P%B0NK90M K&XR0/  >yw6B-W2B*%.se^\U>tqeѕJuvqŕYfG1}Aa ɮYp!ode3HM]=oc!SoI,DlI"$Q"f i%u K ~~i׎pފ Lu9mBU8z5z!RԂTMPfv}>Aͷ({ys]`9[4xu/?P_\_ yjbܛ2xk ?F4~c *֗~&xb,@ƳhDEŗ Md38<Z5g@f/Dq 5Z+)ulgI$$ GaD#‡T̸[*I&'ĥUKxFI, zSȻ+æ zM {@qn~o&|K2TMV}8⨓cռa&Z0'L'(4X-=IJڙX xFҍڐ_=m)FQ7jůmx'*%']]+ , wիp%繲a[,6h$2q磻-j|TcEjlB^ݧ,\NI,ƆutUk'>)&)W!8A,Ѱ.~ \&¹QK4qI*JTBTbO"Fcn)u'&*))F>1bwJcG\ % SɍIlB[$qr\'*N`LJ*q-ه_9R0Ιb@o-<ũiR2Ԥj} cC+g\m5Ħ1bD4XS#DcC)84Y?vSZˍcT9"z a5IsEDW5E~|L6{ i5TqN}][b">PXj* ˜ڂKS~^~>oG5X>ux7 VވLvHXhF$y ϩ^? 
vqopi)C]{'WkkϧHt}6_PE:*,K]i pp9R r:Hk5o[`/„޷1[q.lpY{ܫHb:A]s^%/hBT oQDThg%f}?XC>'K)Kzfh=b2z}9e&Y\]'ʓJE1"bH M" /U[>T >5{؈Qx5 Qeb"U/b&!n֤]f™<}%a_ n У U:ЎVWJ4ux=ҧ7 zq7(W/ҞE"&rQ26f:׌8cwbw=0ǩ~ D{ݚ*Ⱥtzqs!9irU.jGSsg &Ԡj)2R!I$ h:+IF`>=F,?8DjyntHIcrL~Bf <3}0p(dh{ fe1*p7њ]ܯ#pf}bn-D0Kth+S)É&geiAPvf x_%AH)oq-NBHU.LArf>My%.5ٓUIZ (!܌7vGIXa\h c>78NDT 'zn'71ofMj=0Nʙ`sXt̳jS.&7auXiNPȰ9 5)@(oHgC~5A=/hւ$A ,s3uSɓ<|>nHri?9a4L<@;rv<"+#ӄ!8 ,.KӹPnq£n@ʩnK\V; 5jD BRQjXS4GI>wg9@ivHݳaڀ(J=6TpyZ=7vh!8W$Dij x ӡK*׮#A3%;06p69$Tw'EhY˺{h?1*FN`sI6[gjW[e>[PF!2heR#N/|Ӆ)2 KM>XX,M v8!$ԫ2eX3v&๩((T1DYRUj6Qɗ=Mos+TV ਬ*+¤TLḥa@gSkJ&WLIB;)ƁvG \ҳ|+Wt9"g(lT~ft`iRϞiIE)#eԸ+~E94etgvܲӡkCd75sMn#4%qtD2#Hݨ-HJ@=M3w;G aATI­kH Sa.۱tI W^?I +ݝi@ G۰LEolktk9UǧS(?ٰ3BulƽZ[Xv^RF;ˑzwo\84б'{TbDu*N"nZ$8A8PCQ} )cuC 뜟'BW_h>%FܕT+)=M9_bINB6E cz-w]mS5 *C=ilHB= *N_&vDb&xꋽ{1 cIϰ,~8G^!z=D#f NI5T9pPuh3A`SUi-uANmi)B1 Tt*+C穛vDsԷlzbFS! _CxPٻ@R'zOOQ3M< Iȶl >XS1A?;u3Ye G*nBӍ+En6]Lxf˃^:uSf~̓Û$o)Tꞛ+nH0d[`qvkܦm&u2E)EVcqwF%)mK1C 6єŽ7onZ kkL&^GHE^xl2\iyտfdzk;؀<==J&00tVdt0s\p>^nO$s;"e-f}N4Z}2;󄖘&"вZ5­%"ve>>L (MI#Q:(rwm_!垠G(b(nO4Iχc騑%U8IΒLJԒNmb"絳33hW?6aK27R[N/f\u Fav NɦiGh^drA}tW ui&HG fͣsrlkPlȃ{R gӤ+pռ6|Hk~";y0ɈtIfp̓| 㠏}bL(p Lv]\S\Xz4U=T !qEȞ"`g=~/h^Vvԥjrb'USh.)F|ű[P8. ʞ{V^ڂʒ9ĔwXξH RgO /z.=Dww5nupBƤyAބTl q'JP׫a ?IgAG^|9znӈT.(8;&kņc#Jyy8Kκ߭=wQ=ΆD+tT]N-ղt2|RJ~J[7uMƝՕ^g?AR3 2k{V轮WZ .闐z=uzw &J>7Uջ3B!gB8Gtޟõno#&e'g;,oפ GDXOe!@ƺ3κ^gy YwkNZkrEMz(wu#kɃ#poGv;A%8?@k";Orf1V3Hv<&H}.$Z !p1`<R"JsБ(G8uv% HVc=[#K^x}St'1u ï)ݩ䄠.ڽyt;UAr|S}7av0TqN.]o~?$QE40 cKw$]#nM:0eϚȅ >Wo@6Ɗ\ 脫ZԿ/i@Ƙپm`9x ~ߝ "W``|p^s!<t2t^-rzm'!s"G0xN ]ΊSM9Bػ,/TP?(.~v_ b`gJ9ۏ?h-Vh%Z<*`2 <)3P],ť ǩ%߃z&sYlgm;Â$yږRx~Pe=m*jnhRtf]fP3L:9atئ; lPi4\j)(n=BF{ݫKJB @`cO LiPJ4c\U4S:^3dnq\% ֔Wn^*_}8 +a HThFڅ#x'gBznfIqMau/So7 XJ#؜9mg G)vhclx!J+df5u xOƢl qխ^[V wPp>fX{B[ s)n8qtHQcQlRƣ*M7̆1)I9j0 PQ3i1udD*xd8ΈpdL,ؠ`UJ8ϔL0Y6˫_]k6eJTP~_ӌDs SkJ"qY(PC IƩQFٖՄ\$ERDs2`,㩤VL$ O1;BF"QyDJ];e"aid闻Ov\XMCMP1H-SoiָbQc"8b""#a1HȨgTP%V"4&=xr!gUpdu*~*~0i-oM*8X\z7(IlEw_oÿZmE&63鹋wYwNN?η? 
D^YYjWE}.*\})KWەnkpFS7!o?MDwc }ҲK=eRB{sq¾;e'{<$r}EsmʫW+>+T')Z*ѯ%w}cjW? ޘhv9%IVْWIF;_OMRheUH1/vxW~G%-Jm(cҶn]w[ED W;7;G[qCxQJޝ+\$n>A% J:MjwrSCtVݺJ )b}Փ:Ri[tW5Z'2NdˉN~9-bV"US{aHhWLV2Ծh[.-y-åw~Vo-ڛn0u*/:&#lt*,F+ ҭ׎u#KpNNm)q "G\ f\p{Zi7ZOa\ရG+M\v&9*AXcvv[T\ꗫ=/%f&TSix@>"d@G֊|^=rM|ŽP/vqfp-"-[&Xu(@i,͸QHbHhd423F)Ԓ' |ۢYSPZVJ>޳aٵuDP)mW` zoQ%`vgt2L'JDHSHE\X$8K B7 Y/L]NuڢGܻZ85s̥rZ:",.ib1YO EdU!~O&Q :yfWϗ؆; -CӉN>)Nw+ rvi0L%Ti*O;Kz S,J, ZoDlJü~86[jmWBo SYQ$NbV}GY@Bz3Y MpR^=Fh&J`sMPGf I?#HMWL0}|.3Lq~47&\ŝL׫y?FБ€W* rx(2m|-`szH޾֌b[1tT 4!B&RT03<QO}$<j%Ē.VT0m%RL JЋ(WCOJo@d Y63|"Aޜ̸:^7CǛ&%RE[/}n oŗL)EyKI IdYBE,>I)k jD6i:}TN8R7݌r݌i;6GH~*DŽ.!\i7]ػd]کHdƾsꭸˌ0},9’Λ `)k\=ھl}~E[ӈ|[t$IB,BĉRdWXRc章I ii÷궠^] rQ/x>DG:$9CY,Qf Ytvf%}V\R֣no*w c>hL7WZ0jLƜF0)~]W 7^Bz\Ɨw5F*՜,Uh}`ZuKTBg+Lh8E13K4eHc42ij&Q  xD-tFr~rE4KI8N%ePxDsL2HeT4Y$X4T184AS 3R+LMy6/v7`z1(k e4oœ2J >4 %0Z'Lib)y&.`"93q{i$N!, &Β*q5:K0aB,6 38V"Qk2d # 7$1՝M\ <nܷV%:eらӟ 6e#.+~w49]∳;gOv7Ka!߸nS(J+6-x8|E-r)8A,ZK1yHϾ9\]>(0&pn@b@!p?7/`b ZF|~')g9w&+)~K`mSn/L䄔jPYt~Z5τ*<~+{WJ;o} S΄93ag- S*&3aG~I wli}ߜ(}w&J̋Yziso߄BUwDݸiV$sJ{$,y<|egf8..`q}PYψ;pGoyx4 _ޣ6Ꝋں7}m)ikz(~lr |wiS1h-WٻFr$W,i> Q;]ؚ~luɖK]}`)YNi1GfjTwt _PKus QkY=-8[3_=ɓy1Ky&Ш0mǻñs*4惁s )̭Y&U"W ť6S'Fʹ[A7w)^9}g`25mu?jF1W s42~|М>=2R)Uf8M\H$8 HgR==g?z1K sc#ҙp9tު\8'ڛ(HC~??TͳqI~ҲP$iiJ鵵'˝Ԧ}T eIQGk_ Ba9Pmh7&܏o-Bomlno-D TZz9v9 s=P m >׬bZPƫydhRnǞh)Rhu:tAB$  0~ A}MzTG\&W"ňĻm~$k}۰|:ߍ \Ocx_worctN͓ |`*9vMT@AS$V#8dፖJ%FK2܍y,.}[uY2|5cH4EfRU)X#U up5 j }鵬2X U6TmУC1e Z5w/ز·ØbPȎ޹zP@I %tЩ-Z2|#5%&;jtlĆ! 
;olx#Q1E)Ȃ`Ɩs GMu7Ȝ'_o#$ԪJfO7Bң'ϢF`> nŨa6/zG>Yj>>e>AEO"q9-tA(YFfk)X;Z 6o$PzDv!7$>)0lHidzT^/i?0gM7"umM כ+ɜBTF8>GqI=uOu CY`ͺb6.ei|wH(D%'X%Yr jd4VdpkV_t?+Զ)6+&"v۔؇[A#ɹhu,J (KdBdI9HgZ|/yM 7Pޞ1d5syLRvCV.uJh,Īȶ>wUd"ۯMUX{-EahQVX ^*.|::T=zhP*=ֲDx;:Q倆BpB<|Xͻ 9&U(u5GwK0JQ^m,7$@n$b,%'ƒvv\yƘUaMSm#9H3U!kѷ!7=S1!h2!cQ#ajZ.@sfGY֞E`*7-B1}8d%CCP 8ډ>wއqRьGsz !BCr8s"yXs4Uopc"oC Nv}+nin I L H7fIێP#\;F!Bm2ltE\ CkDkFoa3`Q:{ttwKwkDh5˧OXjSVj ~KJOc!AY..PKG+ʡe@P=^B $5n2s~*8ol{-jVcTߊ&ZstQ^s@QJd^fϖa׫QfOĎfʦ-JV#Uߊ4G0gG O*]m4 (2,sm!9E)5 ?iP"pIJx4gu: ƌ9T'z׬V>x2F)RO4*EkUIg:=f˼Vc%zWv5t q音$[U-Y<˙$UҼ<#m`KiMnL{yStiVLX_2\rbQrYyCaLI::;{)|pm.=.*:X6,*}^^T7·Ø; 5ŕٰ#!dMFa0"Ċ^N6T,6\HB-*]IK %Lz/ň <̙>y8;3el΄,`p„$q)2#m#l ˍd8_&u9]ݶAIb#٥ o f; %x<:Dݢ`<9v.'7>sq144|G"Rf)-@+bpiTΫ5yi{Z B1N@pqtvay,fPTE,Wz^"0HRFʄrE&\PrYƫc^>`҇O9+3)H2˼" Vkٴ^OC;xшuH;db]fέ='VJY" ̆6%X-q&Z`k<6qK403@l=V8 ձns1(Ɓ; `8{;\ĺs܅2)~Ivr4RӶ-(XՒO{4˛!EN=I@y.K@IBݕ;,H&o i9%uZuӡQ apJ,>rɁ0)IyzMgwU)jqwuvfqzj„Zq^v3Yxy#pVzN"1=/h7!=Ct: fQ!,u:ggo'T E* C`ɹjI]!\/h.%6lIC/s;jӈ㻅[\ elݕh]möٖUQҲFr0 eѽ!/z Lό?4Fojݭؗ6K0 6FDxc>$N/szc^ӄ?5q\ʢ0FPoVP.P%,XR$*2cg*%q9Hrtm8* kCwmm e;M]V}ɖj. %28l? Eɡ83'QE9t`D"47VՃ:dқ$_ܼ6H=(C !A2 H"@Ib$""0A$0JH;ER㵔 INL;rgҵH%O}%%K>8JD[n/č$I1aXJLU'H;A"m+DTl1ŔAU_@@،;%,0ֽ(A{cH$NC@zPJTk}!CNǘ@=oǣV-Rifadb|u4uܷ*U,&%DFB("Qj$Y504La׎H[ L$ibBv&R'{K"FN&ח&̹0݊1xanf$k=3LAO"4[9I }=QΗY{nPvp[W?On\_ ϯ:zQ]_O:zoa{=p~sT{f~X?=MF ԑG7,(q *?sƒgbBH @@%eFM{r}_ä`MKFLoz"6g%< `5BJH9}˹ZHpx9(eѾ 1?\ Gۨ~9uԪ"Dvȫ\74BGuzùzPخ,晍U +3[Z녌-!rۜFYo[wSQTėC/k(UF~z&!&%DGgj jCR:˧-^-~VbJiK.iJRUTجF\żr4hǔKe-;ᯖ6Q6N&ndU7.>yK~[AϧX[}x@ςm7BTHj)J!&0v[CqKx# d;SR8aJJpI߯ gG8xk[pwG'NZG[(l SoB\@kl-ukCo'x:ے)k_X[5\& Ͷzo뮝Ʃo.,>i_EB_Wh5DnuV. 
ja[ejBWxHFQ/i9P丰 df>x@-P (́=I)Î2"H-do;^нL?;Z0' chl՝1, %3 Ֆ^:NH[Oxz|DLK-#$|'tBKh~QI:ʓ-D@(0AI) PsWH#8jŹ&4A`A(bRPbz:d Ċ>`2U<,.jmI(Ǟ׼ΤRVb6(Kf QW '5+u+C:ԋ~jp Fjw}%QPba ;A\è0D Iƣ΀t%.Hֈ+ &Ið l{I5mXu*i@J,+vuV|JC&V<L4&@0a}bQmX8c!-2L-D2HCbء#=Rê2ªd[@a5Zo~e.QH* E`fh ;-IĦ7 WÆ{?o/C U2IBM/糟3~;g~vx1zTEI"ݽupH(S(C+b4IRģin3L^5\m3ɨ^6;clVꐱm~ycc\FI+B )GU>jѩ_6_qQ" I8Q5RcPA",yP<$q%(QҫB QIF1faDCH E"(V)Ld%("(T;bkQ=joǣذjifh#,r2w@z}I H7#KC 1e$$(x@D 6+ F /_mG6*`OZ, jwqT.@;=,6ZLSn HUg(EfjSzUk~Leo|qDž #^k~<*;ϯz㈍xNfAc~L$Uma7 ,kHp1xw;0qṊKBknTBK0ۗnQO/=@!úSc3 ^z=/ W}eN73(QNVj1ѷ1/lIgOo$k(_5z 50V;<2_5?N\Ŭodyv#~P#mfZ~?VW+AՃ\O&t+22' H4fJ&ASoNX>[3?+ϞPK@ TPf}0PDDO>yjr~4X^ ¯X׫=ęﺞה$<"r0% 7]Aũp6g޳ITfSCcSmI4K6$a-$k!eڧ:-K v ͎\Rj%gnITB{VOJJ;5R{]q2f2 ;JaRCRw{/joX-];{^}'qHB.b%UBLbj0rV78EwqV LE%<AXn=vl`%q0 Y!Qd%{+  Y@lYM%ܥL|]*NE)$MXL8 c8!XqIR# 6#$xGѩ׳D݀6*i~H%;PJ6VQv #"d$,XD$BRPbP);0rBaVXA+nU2/2l:F@ $|%n2Q]<,Y\C*KO= ܫMRiۨLž^W0gACCV '[{Y7?a@kZCU6BvWA'Ow'(pW@k_׿[~m(1NW~w˛[XET2HqInFD] &%ٌn7QDAk;M5 T9xT | ]r%Z-RXm0%?0昽:lɓn `yZ61V]SںnBG0IVpXaF5R3ync0i@ SD)2J Phv/&+a4UE$D\I&m1*RQScab- b/U"I vN"x5K&\bEh_XqSŸz<aW(® rY<Ejv5*D%Ns6>眞m2X`Ws vr^bph0Ī)m+63xY"3ͥn^cn\8BZv0$џ,43*HbXɎ(Q>":\'U0/y%>J["NXUc9`'kM'0Q$RqrL{~4tf^8}DSu:oЁ[f#sH.e6+gW{G2Wk!J%C=֛z~Lx﬌B[S ;E.n X= D\m߽ Hs ɗѧE}:G}:oo1P|Ho7!=y7 |[W{ެ}?77+ڱ)` 24$V=j5" XhJ>'u ?+m ^6(zjF=IMUQCGD8C`D\Y(y,+08}e՗&S4[C6r ftEғtE(c5R Ë1m~WV;.=.4qhpY@#$ZSlM}髏zNwbs*wNjK0Yj gͩeuT[1-Ϝj.uw~TЉ OSա(TXX?_+S]uT[TXY)B (J~r@56dp! 
b0 Ngo fxIqNfqM"s[iՈYGR  ^Ϡx9%AP;:oN^~,|8WJiY6= y͟o/MI/`bm,"ֿ1844Cd.Q>%b<*wa>D[sD Kˁ+,Q0ft2媚u}vR3 _wRSYɆ_' JX4b>co tb7T ѿ?^9 co0sٍC|-RK]y> .?>)÷8+5o.^3^f>A(v0fսǛ9W _}0{x+ޓ!B[Dq3s p$v_~i?1vFEF"5]?Cb9 שwVN kjYMOG) j,._id^P Z]?3,"3nlQQp`YR˱̎Ǜ123.!ᦿߩfG*@2)Ș"Ӓ Crj;/}faQ%d)T 1`2GU/>.ŴK'F\J³y~im*#98?'\IK0A:IB$~W4GȍTj<68L _͚jvp} p'7(ZζVV5c?2PmO{%6J lU_}hk6JTvᥬWZ$}UAӦLo#wjs(Q'l4B&) )*rlr$gA!*55kbʕ4VРs.(1YbF: q;E@IVj|=o(5)7hb24:n$`+eȃܞ:H^jC(65*٤{9خBRfy efc=,hR)RL2&Uz))r6HÜnHNc hN9qa֍;2|4(> f?)f±f+31^$.cJL[3ϵE`g&X#\B$RHs *X\Δ7+Ʉ4y3RݘI+3֒k)BĿS,FB,QwVƃ L`3)W\ccJv0ާcA@ !2h Lā<|$',%DV S *9-Z,M02d5Ovaf5xBl e;-f0H7#He-܃3'@Z9U,3 Nc{MsFC3 ~Fvj$k7ɿg"DBD;2pL2,ʹq&s t%pZsYOD["1$b9|H{p,i@IpB޶#>V u,(iqKMm. %;ŗD{ѷl9W@i>[ 8+J_\?Lqcz֠O'=t_UW;g iJdJԵ:U_TDUFZe,u)jlfgJuºh~I*z!+]_H[ie,6پ,t5]M` ٷwUoC1'||eZ6g|BJ܎AR3l<[twy.ܛaoh0)ŅWUGgf/z_yli1)</2Ӕq>NSק)7pgӎW*[ܨ&0 |py!7xp\+?q>W~M]9sw>RH9#帯SzwSyVBwwR ʻ*t@g gFy\ҊƔ mkDWGVK@ib/ [Д@Յ/5<º|IMvG#I31;YOxEDfP4xi}x[ϯlj]&M0hReu5S`K?W&r)p>. n٦aV =w6Pzva;W3Wg6ܑ\ڬWWGg}{=q^ug3Cmbi ϲWs[\;PiW0NN?m=SO,S{&tCSOO b巂^b[3}PO^_vܻk"8 }ٞg,^{FS@N<"Daws\q47I fSAhMN&W`n< f4>b{YĞ 2قL4jL8 V9(hv5O;X1_ ϯO#G' G./^`LDxusog\4t[1Mg\[S;.0ҔK"󋂶 MS }y]245GO1ކx)O )I2 g%9ɮNA<ℋ6K^ϲ[i &P[ѽj5LmQ|MA Hrl5i|xd^kj}r[o{rOWvkj}r/[hTԵFۚ?mL䢳.=t\Զ%E6UkugC 5Z>99-4Jӗ2&;9:>?"ҒTl_c"is\alj+$yw ísUNg6?|}S$>v "g5NAдu_q}];ߒ챻;qحBX礛0Jvq+Zeϟ0L̠$uxw7]0j8Sj"X #^^b_M qW"\ d(J(L =PҐX#2VIυIn)L`Gd9',6&=Ej?0bފ<^T4@:;lm F=7+B?B)2i'WPpαל822bY b xX @w;T0iţ= )9`xHF$ Ga(f(U&x.DRhx5k$[.IPOM;rLM9jISkT'J+w}61Ƴ>ZJvᓀ3ȬA@8DͤFZ1;B0B8p^3py @}=\6Z:v/Ur6gM: {_ֻiF{ΌL1#0 - caL@Z%9<&+?֙1SxJy|t9]5{; 6^ 3yY'" N{-@$'΀hW>ꌈeFY@^ fܓL9&3[ $Dv" }Ѳ"Wʔ.*e26z8\- quj0*4̊X l$:Nl0tB@m ղx^L P],Е;ZX~pv+VG-,fS~}e: =?v&ҿë]op))J#wUy᫷_y ""C4O立/çbY_χC@Y) ͬvߛ^3^*% D^=|͙Dbի\ø UT ~ ' 9uZH7':pd;<'I$׌D\BB {.X 8%A&3•,<'|z-5z]:RiWEcu.0(zV0X~oǡ mi}Oc "+}d>_kÄV̧#&"7 -9sz*SY{kUV% M4ɦj#/5sMA-ycRK^(kޮo~;h].$}}sW?_Jݳ9AظEAun€GnH1ō#&Ÿ5 Ĥ$ā$ !/)G.aAb:FORלncFNG)*T8lU\?x F3!N8K$M.;J:xs8V"M)(hu"/%_>ۖ>KR1hBSqHw>uZ[:K?e0kB>sM)ųϧލ׎2ѻbtRqw;_Fu*O7Ѧޭ M4ɦj+zz7MEHĠv3eq, tN"wY`!&C'=^j& 
:r!H\"g{!]ioG+f7>/`18%Ч]Tx8v[==9`?/ݪD!Demr:Hd?^Vwi5*u:J|tzsgJj'\YXުJBR왅5*:XqNl=F%Z\%&vs}R:RJXUKyɉR[9li#zϕ6b a:fWdׄSʾXFCPO$9R Dѣ)8/9Op^у6g AYBk8Й|;u\xBiȫUeS:v9>X_ϭz*Ώb^HŬ%\7Fvb17F>h~dHn\_ٞNgo![R~1"LYW}n ^Zï>9=YC{\<~&njK8Prw1K>MqYK Fu-0NYj<..~?,0/iz<ONvϣ2-iO~9I4uӮq.L&.raYH3^GI }$cː]t $<2Mw;MRפ<;cwʺ)(\^Mt}ڸOB))Ceլ$+l.IEzfLi3W; æ0L/^~2Aߞ.^}iw1/8m+Za^r2y}P|kC?Zl`~Z ?Tܮ6^tfȮ44 k1Ἂt {jd3A"$/j[/ּm-Ą(ͨ*2Y=$|wo]3M(UT&egJnշTT/ עNY H'ݴ{GMW .F׵Dn"'mxǒ]a݋f$׋6+@QvQs=BԖ9DpS;/K!gVKp@m@+QQYE:Sx^"L^JɎ I/1 SL!Q+ abFL.aJ(#1^˕7`jd)gc鑫j|?9ޑ 8^uLu$G=G覩n>O'4ǶrK}0=k=Z=GԜx%V7 QKru|8 ~_UA?dMUvUyBKo ;l<,J9Qagf#Ғ.=UW|=8J.3>t+UqT qXs9yح?C˜ңehyj!>:\r6G"ڙcή7PuG^xs0/Ȥnt OQf3OZփ5Tkпvqs:XTB$ћ @?v6fa]um,ՆRm,Ն굡nut 0ǂ 9 qGQƹ^a5g+WC^?_uޥK+PUgq}$ClrT⚽EX_,%D$݌@n)l*v&bTKo8V{a"2uJX1hF! GY&m: vVf"o/3[If;t&G81v΄,^ _ dvpg%wY!2;"bR[#"` @l $(THWBAҗom͒*9b:Y^jA;AKEYPnr2CpӊEF"`mw{rH =J ҹX"FFoƬ\j); 3Ke ZK+NLF@4`>;a2aPhO9J y Y.cZYѝC(2LKsiPKQ!O W0=TZqV/e辜s}P6ag{t5-&Zzפi [ UcȻ׏ 0AMތCO]^AADt{Y=ӛ,`{oYlN}lu9=Zp j dg8}K<'=*1 7ZpC%0PAiauOcXH%1ip'YK86f6ud܅G¸K4RэC6aݔ2(`VgGܿ7WvU>IFrיO֊8:-;f11‘q\rНOHBp{7)How3'9\ȻI!,eHew;IVyZ]d\ۋ"vt)&!0T8`#9 ;d|7̱ɜEDRJ@Dɵ2aH0ҁSc2,`4^KeMA) t|I%X,.BH!M !@-An n]B ؗP=+R]c F$ӥikuV A.HeOr UcJyHT0\Mb"(/[_IՄee8qQYlF "deL=)PqHQk!6a} y/F^Qy }{}rG½>\X!T~MxW^ gpL&2zyx|ݙml3xr{䃇u|&hN޷!xo/Q0žn} ]}N$k!4:N~m6YȊ+bN#<i؆D|co}xuNV,5Pw~>< B܉N*"Wy3&x?_]TOvxU)]yWᨺ;&J O3|}rɌO}{xiiwoo^Jf_-pnw|)݆N:_,R4C"K-YPxg52g5RtÕMſ@˭0 2Ha%d )'[~k (!- ؎d@8ԊEa%ߝX}]n=RNB:O/՞0^cS;@l a4^ͦ/_0~: ahF9yJŕt.04x/=2uZɪM-M'q 8EpEHj)Дx>Mh7&>Kѱ .;T]ڒP P-ؙDu 45Ȟr:OǑİtًcCi =N6LYM(Z>b>$ݳLV Ͻ(`Zt9(ub.p (X0Z2 pćZ]x}βя_̌&E>?wnPLC}aEv:qba_?Ѡϳ9uF?z%˟>\NF37ZqԪ0|nSn׏E ^o=af#dQ쩅@t*bUHYl;Ex;IА 8R= e!%RX\B3/EI<1.gy9T !{o>zݹV d Adg8x}eoCPb3m!Al{L5.c4y!F'ϺaBPv`i 4fMV r#\ܺ"O֦͛g8/B`b , cyY|09 ECYY=Wi %i#2 ckL39y=V '[4\=1',So[̙<-QI6ybIf{ bߣS;eॹlަ+Ќu_[lG SѲ-S@>0pfLrˢUe)ZL&j L’#g'Ie? 
jjk?YQ@>2LMb3AFΞ]HRTvtuto9t$Zb =AqX4y?JBָ\JB7Z6/WUNV!_*ATq}yҖ9EhM>.'&B^Bb9&t,ȍ6Z m4r(B+TĨ$1Ĉy'U)cH`E/q뺞5 Y3ZImā'gQ>^>.qD[B+Oxg VISjҤ)#། F:~NHha<7P;c\T`=-0]+Jjlh'%U{$.G%8'\ʘyXIx xLu4wF9z +示\YY!!uidwVrwS|s[ UV>PrRV>Dt!¹6ÇZEXy[foXyKas,RoauȚ*=׫;@IUl TϢMcWX#P&#,\b\uTЉ8b:LXr,iL9| g,QAO4蓍 X b W:|$:yǹZ&V@C*Ylƪ̅DiB$HM!ZD꼉Sbܲ^JKp#Dv&gM6obvoQLSoF1_ 6]ܽmQnne(vjJw7QQѼ!81y3NK+M=K[WC:yS$Qޢly_^-HQ QFL-0L% *Nr&;-rSqwc4<`wMp# ^L&j:J{eLVT wʛJIZюI^yɣњ䍛k{{B!3gMsu|zj²ȩGs-"]3ڶ㝲Yxc~іQ} ;m/e؁:"GwK%Ӽ7)@4\HbN~>p&zJ@89MT۟\W䀬EBtC$؟ڡ0ݐq2HΕ(!DZhK7rPU~lēVFtZf+#Z15o c1}4{-7@ 0IMl R괔0ILz)~vâr7r\! +ª._o>~:[v7?~Oi53[Ng[W7}dCj@\&_4g'/zwg*A4j3*ĉ^4.4%wyrʢWo_n֧Mۍ:)W{=ަdQg|,CV~\)uhLMs6m>|&Y8F›騇wѽbRW8b$^my{gԖRTKȍ9䥻0o[x;%1@0-(և+a$Fr]!]@*,+a!G .wodN@'\I{SCݻn-ZeY: hUҟO]aJ##l=wNf1`Tqcyvn"MQ 6h۸иމ, *h'k[)i.k.WcԑUg+kN8;\wm%píA׈|["skN݆қMx՞;XsJ9)>զvèb8zZ:$ @IU_wPp3Ag=CvAw 9RK(F }F"%$A!e3CY  TێۧvSrJu7eVj%+'sv؄?Q (3wG*4,j |*t%S$5ZƐDc$i 7Wc^hck mmo\p[p A[JTPᶄ4*.x>x]],;OW t6bMNRSTo* %aR1-Ō:%>9ӮE ?YWK~:uݏt[T$VP)P" !R&R4/u5iؠXhEn%Ǐf9W=: H>um#=ݮñ#$4M!ޘ>px*#)ekge^tY}Bo=q< hP@p@I;Fg܀!F0TJNp9NPE4WTTk@x!iRXROHjmfŀM(^$A4cr^s{sQ~U?ẻ*Q ']qԗV6Hf3>3|viVr<%$6}boxԢMA<<8DdKh%;Z.@.tILֵFHJ-D-Uv )$t㚄egQe83&%]Pge1KMPޭY._N[Mf9+FTBV:sRREkG[Kk7TZ{"ŭۿq@(;_S/-o(Yt+hDcϭ>.;n{gJJPP&"GP'*R+ɷ-,J_$tY~MZ 3pymo"4S3;}I &Kpx8B= oz:_\;TܳCŊo<;T\ɍtpS*B1)}TdT ` ͣk2M&я a٤\Q! 5qGZF#K Hi$ds&Xm].j~Ws-XHd; '{ GY*Lvgs?O/M$p ӂLVSI6X m^a*+DGl[T{%MrG ֵR Rwp"c]3g_g?2zOw)E gklJbXSrD 2g7xS{OJ۾[zwpKVa-xŚ3f/6#AakXa)Wr(|vHL?HD[o-!B) Ƒ`#ǚǘD$T1y~x ڝPxAd+%5F9 Kj(8쁞 "3t:Qf {u6)>L%?2Cv?Ά;Й1qtsg4Qdžu{347>SW1~&$|6S{~8C/dq'Տd6n&w~=qIu:P|(SQ/UҘTc2:lQNV(:5s%2|w1Pn0逍NՆu>Mt60:`f5Pfmv>3p?X%Q4Bl'd`8f< XЇ_v/ _^/v)2?L9EA6.7ÿz]խQ9{4`WΟYf,/eГoo?|ջ@^IBJ /+CD CI L%%L˜62Q%0p e!H?%&W,6)CO_af/8=`Wp߮n%~W2yL 2_n;>ϟ :9T\jZ_ɿ&RĪU=(ipVCҙCj#95T>m(CQ8ϼ;l MSLp NSDXkG) +˹Z@?t;B8A6RǯJVr8/҂>CSƉfy?z+ؘ0avVP(N\"T8 ΄gQŬPuZuZJjDrBB#sIc6e`n08*%c`XG-` Mqc!4U) &~)LsL:I̒ b(Vhp+(OU: ^`C@q:0! 
"T0Sc0i<_!})֯` {0yDsLR&̄&dwu` y2fi.@V!T} CT qgxMA,:k3' Pc#gǫAH !gd ‡(_u;ߗݰLaog:\vE\<$D$xuجqӕ\[h:P=\ߙ/ӱ-Nc}2E(qٟCl|#3]~Y\4n$=>q?@!?Qgu~}Ɠh~ 7]?;ٟwdV*f!n-^Z}sg%p_hH4Ͼ|_%$A aPl-iEJ0{!{ɗ;Qvé>C[{qv猿 {;:C`4uG:e,1Jn6l0^vpR}:}xwq#S^zs73ى?`t>MфlhFi//U@@>QXu/Nx-#5%72C5Oj!'=)1-DOKjz^ C p첻慰?X0m<+\Sv pj~`ͅMG=;q8[#!Mr2,8|`/*tj*6Gc >^}BIdpA\e|۟x~y:BtSTYp'P B`-zT}K Iv< "m=g 5A1ƑR9I6!p'FMu>5M~?DI/J SW_u>5_e|@1Up₸/:Nn#B$ī!Futc{6lb(fHy8VV-R8*A1Xl,^dB9\>F J K(I@cJN:p%8e3&ŠP D! G  X)s (ֱa`7j:kFc"[Skւ5})Kϫ}ɰȵڬU>H`&etOpҥX^)G:&5´Q€%Xqֻq,Ӵbj7a O96r0xW8rZOXm(j?kB0jy} 3zvNvZ4s8rrx7 lAμogs7QQ-{3OBQYUoxhE%z &Մ5W`oL4߁fn/*4 Y:z 3^@LYd17t%EyvD\jf*C8Tr!xZ&T3!Sp%) c1%E)gvϪS :?_qoE 8P?a}K$E8ZȌZH3_h'T}闞/޳ڗshobū̡ b Jo0U\v_=o_`F!WE՝^SWl_x~ ~[VoO1g?\ ҍ]w%(mN.75͏HMÓ]Ɲhqʀ4EF>D][o\7+^Zd}I`M0@VdKZ< G9bս;noʷ»AzoA#TQf)_dek !G ;>gm5Kh"fJ[%DR+!H+Fota vHA>zAgZn"=:̿٣UV)juE(Ҫm`TJk/%G+DG xLfuF #Q+IqGDy`ӌ%GE]pGKU'IUQd5|3.l.䆃u>yx>#AE~u|:j*Z V|^8FxnZۓ|&ZcSb_M+[-%S.ۄ"2ƛݪ-oDl a7 k- gZ JL]f 7Ƞ=U Z:a!߸VTo?VW\vs67w^(zފ5T8L6 "ɥ ۄׅdiB iS$EkbL#8[Hg"mN'1$b͘u*EbE]>ޞ J"V܄4Y]y:knpTlgpz.V%%f6;{y12T§W*6GT"lPۇUJlJ?E ji5A-*$ruzhѨ驪)F ?s)6`\=X·Z)ԁnK8gR!)P%'k wE3ˇ dd:Ŏ+RI͔lP~K!YLNbrBOZysJ8[u%O˧K(Z-8~MGmU^}ȃb z)s5,Գ&| =uD-HXjTMcd AD<̤#iO%+lUZTDJ'T 0oR-WJI )#PzsK QxDz5T#\u ؖ1'%7A z HZ0 3zg|In 6-z)s.BcXm-sf{[.۳i![7' |){Yô~2:9~"AH@:-3/3hcI o(bfQs$n3n }[RR|d{GN"l{ :31+G)g;^1&HkEs=V+KGM谌<>OO.ìE k?ƅPh#UI7څ(;*>)lHC 0Dg>3kGDpXi_)яMdG8bn܇?r4wyC,,IvcZޮyA~w N3aw}vpYBc1,\c`3g9R0!Y_zC9z{;.ej_heS҃1;şo>g+6AlGj@C%ф6Xm9C@$!RÍ419еs2PSTTvOaɀ(dk#$ic4hJtկ^}=.rZ; u}pIу5Z8KUiR2u480=:4/!ޙJRJ.[_$'g/# R'3_!vjAhg.ZM`*Z$8*jcGrڢryⰬ hjr8[𳝚j1Z4#ރ͡TP"i@!qY.U>9"2qOI[!1QclAJ51(Ն 'Dk?| tR}VΑ)&E֥͞#9A'$e36K (J5O-#'T3-[uS߅뛳6KпzI<5wkeze qD-+>@-|SqVTo&?| PGVƉmҿl_v怿eku ĩ$gZzHmPVJ1 he"{QDX IWciEfN.rt+"V7~RڑX]x{dfn:A(D*-ƕfS'Ri%|ZJ{<ͦyM4i4۔RI jAyFXQ]q%8E]j$ӑ`,Sr־;[3S@E$ۭy9Y2[#2!(TI1iRjϭRxcIz2\EƬO[bZZ kmFW ~n;ȫ[J`Kf&gh Elvi=(QRL؀Nj<4/^ `{|hwep!>h@[%8M,DRjpvB Kkgq /q[, Fdvj9q7UZ-z Xk]!r0iu:o$LW!d".~A{5&nnyEk4qxg*dŀ}0zwy?N$4&4#6[qo5v5x@ r@_f/~Ba]f8b_Z -'O =/"K;-b=_| f}!3j24-N) 
kmgaG0.&nlNE@QT_f7;-MsF֒(rf~ sH2CUyN>Bmu9my;7 _ 7o nL|ЖV w3t vvx /,g /~KmdʍOR@esZO\mt@; 8 uGU?/Cڹ+jFc l;\WK{L ((14 X0DE&nT l݂GksA^Jpsٮ@&8歷pFV_/m$^HCdփZmbC+]4&!;7ÿAB As$Eǝϻ:BspWFk^܍[5 iL^;SZuG0\m?P:ۍYh}+Jf\:O}̨!4wJg?EiP#-#0PPmeTT6r2# NfwRٺ SlmPuAv[!f- 0#lزQ4k_AI8 s,zbl㫉iB#R,AHosR4's=&)Z{8"P*C94Uobܡ2])Okt9-JƗ8 (r xm sķ47i]!Ԧ|vf}m(aםeΪ6LfϤ]XO7S\ycԌ6e?OZADQoM2=ga-|8ڇY\0kfq U`{Nj(vŖ D4vXZK'8 q$3od0oR$D3KE.æK 1T!eTrIk0Af FL|R#*}a"6An\&`^iM5!?RM}"'7bs-o&'z&ڛko2ɸΚ p(1jJXsFs.g[bJ(P 5܃Xц bE_Y)l&E}˓,oVp rR9 XIU. e,sߖ_qh]Pм\~t2VHҊbI? dZA2&a\XBFJFDE"ҙ0ܨ8p.)"l:zjVy<,j# ɒpD6!ݘY!#dV,}@^9YcZl}IHi딉 !rsg%gg-D1.0I#b\8!)c%)u!fDZv6ZH`D6VլL$eV(,{U 0YF"J #!(nK % Th^)HWgk\C!K3!sFfQg~=bчay"ăs c҇[gۻm 0&R¶9B"^,`t=7WNa}Jm V,fH>k;#Ȑ ei4^-G tFyh mXϜr ZfV\ۮY'?C:C)|[ɩVsgٲRfbz>f_J;b[<\-\-hAbee1Y10JD;3!y>]v6mu48Eٲye^ 93Cvku|™e敟=~A֚|s Zom[bpuaHʭvre!K kK(pSvЂy#Add5_zF͔V)58/T`c!Ŧ\3t[7[s;t+}Vr5L4l$ ql_ % t DwELLŋ {q 7EvX `FS$u|OQfY *&pZ f frX[غ6)j9A#&`4r٘KDg8[6ف-ut86s o^#F 4CPOV"&F8$TDi&% sscZ%!kQ #[ϥf*틥+}m4 0i a&\&z_Ibᛦt[Zfs^r_$ XeD]Rj+(u9h&#"NM'} 2HC߽_܄*8lfqJT0C!83lDTkAι'QjA9v!q Gcx6v3d:JGH|Ӹ \NnVgFRL&wsZiVg`,'jD1՚|?h-ŝp`L(P(vVaʥ0V /ww7Φplum[bd6TCjH"E9?YM`Yd:UxSz1M$l8G칛FNa3C@9B36% r?!Vi*{xN7gvh]?*hoLE1F.zAv㋤EeU_ WӻZ֏ji}uWteԕv4Ww7ۯݿLڃYLfR't_P+ut,t鷽kuzwiIb/K?7ڦwKz*`_m/jG5 G3-7MQܸEd\g'`<4LTk7*i^R n+k= \r&G:]s,$0V֝/nb# wO;o{P$"[Lobl3r`[mDG`,05~~4rwT=޻TA8NOnng~^E:+}^.z/`Mn\Jzhu=5a8:`ºjJ?;j6;_'MT6—v@z6&ю:_YZYiu7ɭn'{H__`X)+M}h^)oLsֹɨ7Ѵǥ$+/_*3v8{>1iQђhӿNV&zeߓ QMA//p|Gq!fΉ åO"fc? 
.+?7]y=$Gr׏X΅NV7f%qfR0W[re_xΉg/ ?gЉR7=OoǏ'* C(2 Qg%ۊҮ4 坓*LxnsXG3P^zj!{D鉏y+?"943j J6A־z;|N#^ V8B?\Q^Dx"v ŸC&1 6LP|} ؽ|'Ή'ޗD _v,]SߕXS."db;|z!ZzxJ0IN0>;'K B5l, A>ӥR i$`e&W N jRGJgݖBP7}9fsmp޶t8:VӬ^=;\?z[/r NHJUWw=[d qջlջZܰDvL/z\v~Y;^]vT?8A F8 z/ .~~t|z6oA7+{}穛V7ۯ?YE<}?uao;ARiڮ?U/jWIGճ:X{_;?q7I ϝ^5K7Tפ>H]yלv|Z~W=:zw _x:D0<OOh>SZnQŕj>K?8l gɖ(XٟXJ6InK?+"L_lF,oҳpva=(Ch̵!J[ˉU $ Rh&1h[(a1'S2.fah$TU*bknkDh +^ԴA:`RoPcB& |&(0V=BFAN"Nf"l AruFP烙6  `LCkhd)qt!j!U.vc/'Wfh:(ohQs,HӈH Jr#.)p97XG Uz=!ҖGiXUx\8)e$1 sq][oG+vOv3T_!g]ĈS 5FCRq#}gHix3HlKNwWU]]C `N Jpp%!YaAEߖ mVOV.PZV0e(:pTR +\kɦ fm匦jZgMkfӂs@á /-)U {vK.f[㌡yj6Aku՚ QB 1ZkehG:AhQ4 BGN7I-[,dcؒVvta >RHX;Ж״\Zs/v{I^=~t~JfZbu|XxWoߔYYo>~|.N2WeN^n}&6p|W囜 7?T\IЗM}ȥ|:Of_V>~&Ȓl>uUW5 ֏lkƶu6kRp#\qrwRqg72mI'?AF'}]yϸ|ZxΎulrgGN]sYVB yҔ@nTL'J s_U։^&T ݉Nڗh:OEP85OsSGYBg#@uVz>… #"LR3aɤjɬox0iMD5Ϩ )q-S "Bʪ-{PRGi~|e`L }!BkN tpIwiivryygZvG h4,'䷰~BW uhgT+RiVVN,AQͳw rt$XzdoH6{A6 2wlW-3ރfꏵKq󺨍ؼffZ5h1Pq0hGon?ݬ.B%YEN|{^i5懻+dنI_=pűC#k(>rPe('Ak \zBk"Ox lQ&^]ӕ-ĪFaDF>ʂ ]ew&u}9+#Fʭnڦ)J~1U\q~ڧKnZWvF;Cc0 y1YI?ϪjZy~^>}]b))e$“l$sENFf5k/=ў ~T{g䏳j8_ Ut6j 0*D ܣ*H͢I9`rK7_΃t.r9Q ZMvVq EM4|k4{KlV\,N#fY{jv6b͊d"ӺmU.5jwVp3IjЬ65z8pVBVBQ\=o%j!P|#+~]ŵ上xsNɹeŸ$չᔞܡ+Fz\ʼn2a$EgNٕ?%f߰{ڝ\:)B\]Ч7:q<ݞxE v+ E.%## ,(4=BO&os"a E D]lEn]] '+@lG]xJa2+tQ圛 F%b EoC\TS \Il\#[$trIcV]W.x ^Q!ܦ[M/بFl T2 ol,;E[a>J#\*ئg>H '7 2.nx :* ݯ޼0u_q\դ K+jg̣wj(v*1gQ\h6d|ϯu8P~8X.%D Hk^"'ٙ Hړ׾!y =҆jlVm)@1 ːx0lGS8Ue;9;,4vmtlQTuT+Py$[f^$pn]8]l׷'@R2w=U](K*@8x/&qJwWvvmoaPR}[,|rz4$vmb]Ey^uTf .`)Dr_dTٜ]fXmw&rO?Wyiʢ!m0凖-x>˻nr2WUfr]ΘUqWם&sl).|Z\o.=w% U]o;NVBBN8Le~Iur%I)vemDMg /\ P$-D1IGB0z&ZbdIwg#Ԁ=6]Yt%jü y*dzU $+M$'wECԍE>YxXesztK {?5 I՞Eѝpuxʷj=*TUuC 4:0u۫~/Xi+6%w+F{  NU.2T˛}a_An?|˄=[fͿeGZ}(/Zc}х^|}Nl}5&+as*moklKUG-!RrWQ[lg&xh=4)Luǡ=C)Nh+:tn:E٧Sq<R VӰ+tOpr;I~ǓMP{r8ǎ[ő\{<(kp~V;nݢw\ &}w܉ Tߺt<b \4[x@ՔVl'>qca (؇}NujʐBΡ,tU:5UbR1p;AGtB*kzt-]. 
ƭH`l ijc֍- D*ÄMۅI:4pca`Fo" &Ù1\]% W܂lA=rBږs7^2"YUx06[D)j XBjAu_7$7^codm/ΎF#֎}-濛#UYYҤ [UH A2ȥF50jNB @jR4 ^ (Phsigr|<׬ zݦ*Y3k &cD N}id3#VC [՝q{߶Uhܴp|w4k5X|vt븾\}s.&;ZUWW]߮M4LnnºcuJƒu_a(T"JLN)7JG2b:L7< 3+b_`/jh/ГJ \NJđ%˧;YG-?eǿbr4E90KC3CS{04ö )Y)i=5R,g1˥X0}px.ޞ>[B }MGms~Ѳ7|&y↉q?q'> 96HC􍘭!\OcqePD)r.l9^Sw/p$oO]l71-l??a93ަ]}ԟԟu9{3gdse2E| cѡD@y%8DFhѭ 8AtI+Z].f; tv|^%GL|[$Kػo2?@NIIhkedX5KR`Dv:ifXp^pFeU!jnjiv;`mU|dՆծ(Ld"6*CeA䃈X OVpi IKF#Ǥp uJlu:OjV%$ F;QLHqA+YkS!zz s)B%rv_C7~ɀT-PF)Q׹8z}Ub3B*Z&ɠ\-%ʂ.[r ZJ鲠 V(zK@:ixnL"!hole%<4 k 5$aE[Cn!4בZpux{7/<:s1V5~L+-#p!ۈ)QdGDғP N+<,(}AޝӅwo3DdU//ߟҺ-koų3?}vU? I_g#ٔ9ߛnݢ(>j Wir[OC=(jWOR~JC`$iEۘ?}7Mrg)|rlZ?*yzjE.t)6\L,x4׳OEyI$xZk j8r<$XZ*kQ9z#}u3[OSBG(AWˣE6#NPʹȞEm)IL[4udr4 'KCΛtbj_YV((Y( \l|պz>LN- :Ɠf*\%^+gcBs),8œ0Z8cXrj.0$IIh 9ZH' B28d{uuNCٺ=bQڔMbC HT(='G9g[HDW99Mj '=ⱥ99 tYv,{Rr+cucfV,;[ܢfV>;sLSJ i-lsI;ѫ`LE q'3~}R!Jr%z_*IQqL1[6Yو$+ oZ`Lb'svU:rV]7&М% X4`+]'/Mv.//ԯh'z-kUZ8@G -Gniԝ > ]cѰi # 'ӝ(%%&<;n)d>gE|UW翤r_pF'm4sY\Ŕ 涮_Ėa||iNx=q3s20l"pQprTDmX{j'@5O?C'0ǯiX4AP+$3dmn`抁LFQ\ 2F= :Z8T͗q0)\ -D.̿CM| _c".|KQ)g?n=uGf-/_oBgGKZ0swAA<닓w!vAjK-5oFrdϠZiJz"](h%5F>2'98Z785=J;^0ځ{]< C#mWTQ|'bݴ-˩qEb1gW x"p|ώ=2#$O҅p9c 'G/G]fd$?a#DDh-e!W^υᕿ>>+x,Aʾs4.QkĹB*; \p/7݇wQ(ߟ{wBkֲDOAeã^#ã!Ӈ;TE 䕪UA+S݈Y6=SfUhFdWGw _$dQh)|~ v7 jwꒆ My!Srǒl5Q3.2v}6]P0J7fccq/: ϘCÔ`/Uɟ=(B (;u,vmg7Ez dRmS83Rዩb(+p;,2pGØ|Tu 6FI@UG-[-BGiS6us&2ӵ]oQK,z#X7MWܦ <74춴 ޘp8FӜd(0WC0tz#Z]3& 1 ;4G܏%j<nfޔ_͚= h5y*wVoI^m~ok#˵HrmwH7#*$,OZ$ A_"FԜ#D$PG,F-WW..Λ֫޷}ЫOb)fd6^gyj9Z||av{?|E`j/* V33A$>4 J }"AN>bsq7^CJ]򁛨d#q$K% e_QKcߎ@\RԖ5 _N)輑HH׊H0$!}x0KSﱕؓFMLn0Z`'=zWNYk=7f[ RgQN' n->Ԣ~d-% ҄azK_Mdn q{:L"e$IkT'ZUxmٴB'gvA>J4Q䴐K=_A_>~ 5O(RҶA Ϊ($Pmx<%**y;,d҇$:]ֳeɉ(CŞFwNJ $\v;0 ێH5gQg4IElmCl.Zpպo7~ɮ3u?@:GN%5b9W{=&t!Y$ڲD"Bn"GyLYч{{ܗ`WC(Fc6gф^WAN?ch ¢`:֌<Enc WQ)jb,$sOATJ4`=ʙ2.lp(߾_O&o L3aV\0' Ac' ׌[F.o܅yURϥFm?cL;ǕyA)<ؤum &jOh4Z:M-&(6I+a.4 >:EJ0i68kf5]Hv΍5a$dq&sKE2SV:(]X~93[Lf4];XNFpsc P ]Щywg<,%|ܰs9:M3zqOTѾ_t^8oz`TlfZ. 
JuTu;oMs rRsm;lrkad[Wb0i'c=h=y3XqkOŖXfdEAܳ*Γn[H0g&>Z)<,RrJS)P&{gccn1Jo )=f=h=m>¬`e޵_ю?SHiMs8i2i.qWt䃷vajۏVjN|:vpɭ%+Ze ڔ,֎\n @HJ![~a v sދDe(Nk戵5sJhIW"`2uMt͘,^:pLrۇ kqE̬Լ_ C`{M6F)AUխZ%UD݊:Ab#y.%#N'",1 "aB4C8% C(2dJ1M%ٴ;̞Mk0_؊/A2Y, ὝR=^R>_w h^M WyIsw\dzz mOCn04B Vԯ}GS^U!)8*\4綬x'EVB/b> U#-RC1]}\hXc@^Bq+>Km#蜰JyZ9#fApg ,!LbIxٛRq)D_4~m^;Gj/Ȉ^/ wb_J~{?&iYX#x~40Y=NS9~ &+hY99i:v',n:bctJ݌R{6'Oy԰u$dR2{6)5߰Yi+b VwDBR3J 5#9:! KR)}\)nmuC]:f϶6þ KvUW8ٙ~6Xk:.%~2zڛ^qov\xiӕ$Ne<5_2*CzY_D~yf"& ޓ6 -C1"bpW{E gǷw01嚘&wߙf~gt &x1#X0>KC7@=FEHMLfwLSGZѹJO_7qb)L0&Xg %(C4kJEG4Z*8^-qw<ԄSAHJ9Jmeh/;|8aC3xwo8(b2]P /SjbDv&{W8'va<;Ytӗ9>kvSkƮ{1*`apV'Ox,t;Gaũ}b.[1n Eۚ-2[p_;dq`T]΁1Խ+!ͪeKP&C`Jy.¯dLf ɉ;S$ ` 02C22TZ0{1q> 1նy]/x: L|oשtE,?y9a ϕܨ3So0BjKp{xgnÿ|Qƣg끼BΙ疴4^C)[.EtQ4=w< ݛ]7ixYZ$mdVm-V39GO9Dhkm:kK FlȑjPJqʦڂ[nʗEj)Wr?%/N4bDrȈ!qtG=#YޗR-:h`il$\!x`åbem 1*F Fwe ]&UF`$*1< f%I#u*PRcF1B)Rs Ȑ <8h%5Qb3D5&AB 3W& hʐXDDH$9Q8(F KӶRJZH:b(G&_RIqZb( D'L`QX/8#%5(`U`d&#t- %#0sR+n0.0ve?P7]L"ÐU BhFW*Јr90v<=Ҡ'w|¦u0$KnL}y5ǹsz/+{ KK͑R4?msr2Q`y חXڼZb@jH>%,#w>K75?_ln03[#M⅁jeYctg\:.3 ;voãYH?hVJ}9?][]<^69%rOYR'{YrFO6M/sTfNFȟ|VS}J}z]h nZ_Piiڭ'UƔBvH^h8 n(ոvCj!!ZzL IG.O(x Xv#%n#04R/G6Ik<]xݿXd4c7[ Hj'{-7Goߎm`9si5u|'̨7%bS،d4頝fjaU:~d΢/U@f?vf5>,7o2zPp[9jvC3MLRP1dZ4ZT䵰?ċdi9` j`\<_˖ &6(h$h̍!Pr"ECC@ !7,HJ-#.4[ܳE'bz}_Hw3@H5OI\^Qc L.,Z]g^{2,~.&#b]NipkP̀|xgg"Oא+< {q}tfimNf!Lq7{ 4s% K/ =?^Ds6ro+ric;z l,0&l^4KgО︡}^ic{6[6?ӗtND|2iGoIQI S<kְa߿4<~IA,WK#mCS FG0p)țBmWLDh:'8GIl^$ܶ;R^N qGS@.&NgڽFXjЪ겇+ [Nf$U1J2e=e֭3Z,F-e|eT7h]/u{OIs z ,k'×zD0H6Bsz}94_J5 UOaO oh.spF!ajIpVxi'DQ(4%K.GW}(j(N׃S[n&&eܝ! ?\\ÎNxsԵP jƎ66kI 2&"n@QH"ưȐ`(!JpA (Y H\N8V(VTKY2U{WsZ~g(vF#ElrԤ}HePͨR?E#{9M+,fJpa$ER J!5EP":S3ۂe=Fsfa: B 0?'/tK!3x4#ii]k% $uwZ_z!XVn]żB*'%O[^!!H EnReے NnSqRW0m ^Yim$<ª:*r}rf-xmpck*=ߺvVϋq*U3ZI He45y py 8"_DG= ' %<-L͉f6HP`ɬwrغn{ )p,qDowݯ/@C'xwsZ D{ A̤\*t {_NUEjpr8' (/~ef oy8Z~_7cٸ/\߱th/_oj)0frؙ;ɞ"3#lV{?0Xm<6֩w6g3zW*WQ1hM>n抁q}G<Y7F֭'ǔ)Rk6[9eᬖ>Λy`|2?bC!BE6). 
yߛѡR1׏Eť}8LF5_?S=SG yc_dn2ݒLkIwAԻQOr9~U3!Cؼs]A52Rm'(ܯŏ,33I dȽTljd޽ЗZ$]L;>U,4r.m̀(%}z`/E?3yN=y2 @N7'ɩ8p8g=^f)M#]m]"P 1P B@R%5)t .@*^t|A397*\`vѴ*V/p;.LJ Ux[m݋֛Yidc)0|zI K=||iL~}ksG}*'}kUX,W[J>5VQ_ y&>3 7}fEz:퍀gdK)AӽO"<R"`^>_G/ן1pr`H_97=us7fA5Ę M6 Ekvol:"RηM:}9T|t2NVoCYl L:"֫z53J_V;]e%&7J{-҉y/zZ륙S̒A8~@ Sb&}{oOتf.dר/XhA6 zHA,YIAHTgy+x653:rZpa !Ѧ:60&.Xf\ {cm%e6xcuX;AF¬uG2yT0T"SS^xQh$8+ilU6B5WBb]M0d֓;Ҏh2zY< ,@@=wŽDm^|gvcm<uܔ1Lč!\jFJL3KJ |+6C d %IbR - V STAp+@KHw\EnQEnb\q(A#_';6#nzD<4ۚg*==пJ ᾄ,XE;ae"ɴbQBNB`Aa0Ti bRaUBQJNmG{S;ou&?zǓ5ܘpg9ۉ)Oef߿xq%ж_3>3JlХ#D#Ȓ 4YM8p~ò( ;܇A2o 2PmH 8Ba@4E,H @KB+1B'f [JB 9b"D"$I$ $'J  C$TH$NI`GF*\Y~g8p -''k59ښP5G! "L[ˆD r*1"FX<XJd>B<'V <t<(ڰ]xxr] /VC=z[ nCj|y1BݡW as-FH%Tij .%=LG /-@so<5r-|>9uO]!\Z }lD](0%Ǜ*i9? "apFBJ(dr7ήڊ5v>IzqUXx'=tXvXأujW+4'v4t!z3b+i#ߞ."vm7oV/-_mѽ} sGɸF.HˆE\z48T)L,~g:Sn03YRvZ n8unAϭ m";[+gen%|d6N3TVN|RhƠsшp'Lml,8qLE^_g6Jpm e8'mc"' kDBAI+ģ iMh@'DHEcRߑ5#ǛֵSz^b"joQɷ׿ƾľ~EJZf~o^fNL M Qͭj7mm 0WH$z6}B2:T떮Wp Hu| bcnz X^!!UMJl=CD=\ KA)~\:Υsb]a&"oM65rT'+ >PKKx 3fe2ğQgݕE<]Xu/ K,@ƹlAGjXsdSCX0$)[8逹eŝ肊0jwD[:p T4 ^`ܳ,-uA}ϟl/]m9Wlae"O tױNOįϽ| O5eŞg)}7^GCcٗ<<%̯s] .^f΢/LHC+ӭ. 2vh hv@{9]t {J$֮rA~u; " YΟKyr)y$hMj$T( X 0f" H"HcE "$\f<.I)K.fMQ{j E8 99Q#CLe„T(QI8 `@A?ӿ<Zŭg 1C%Ij$,HXH\J=b$h`@  \C I[j's{~<q26I17¶u/Zom{(u_>=6 kס>`2Ea;3u3d,ҟ5:j&IM]Qd׶$kb^iSt1ʚmt1w e|ᰅGy顑;l[' >m[4LJ'EOzӕhVWߖr7F!-0MvAB@[6p68D%Cn`xwP9Zv4Qb~.h6ob-aew.v̿c@-76sb sY 'C](˛CX+`Nw79KȻ6D)Qۭ!MmW3ozÝymwj}=vj{lzq^ϠZP,SRM\ƫCgMQf 윏HX:(Ã36`xOV>20v]>[p){!~QE(F;'M8ޏ}USk/4m3 G[=sVqT樖R%x`—d,{cC) X>l'~Z-("D{j'v&<6ZwwB8{}t,~&lPSDAM*85fԛMNqdqqxF&@zYMDMNԚN9xa@;/|-, ě^S0+8N꿜jJ˞P~O5;/BQP&tE+0ipe\ eW`>[J F?/Z:OO+pVHO!JNO~L%V. 
kFB~"$ST^'j7]:EQb#:﨣[x OքEbHqJo6[ֹҟJFܷ4HL꾲O- TLwQYU1-QC޾KO޻my68AS%fFRuƨRSW&7>yf||>-!f\RGA4; ,& 6R 嬂3[ !Ue-J©e9E:vq'3 yJM~w\lfzs{tidMł(^|YE o@ $v2J$QBsWj!O/n=M+Znr;2 +ńl QdG6$y4 }U;\I&`ȹ.<g>űAn+"f،lcW%WEjS] |6]ifNgV>CgB+"?G{ Cw.[Cs6tp1c_kaلٰ0Gx;^[(C0,=9Y:InRD-C{yG23p=ꯁc+[E:ˁyrI[]Qu_Q볡 įkɋ giA\Ɣ0oyt-:n7p U\SypFޣTﺗH?W't\?8%aįQs0=o5|Ku"wFkE;} x/i6Wb/iE'W~=)<""MM6P&Gؠn|z {a/U!9*{s /hjKY=fn끣`rI|~uQի@ݗO^q9>xRHT5^TֺYT.?mKj-h-NtDPeRqKLJTk LruS Ibb%yDhJ L CNE+onʲTW3c"#i±HA}8!RgF[$T8DMoKEf I,Ss1`tB$x_sTqzvOףGɪC r$>ۗ/3G'b;7oi?'.ͧ7Ղon@W)H |?wf|Y*ߙanUƠ~{BԘs3nF}+*ӊ+R)n-M2JqF'gb1eQ3pհRB0XUl4IcSf12ZQMJkqc ΝޚkϓlD~xIRH:,G{WLy:,~ ϻ_[܍Ǜ|x11An 4'mypiY?!_a1ZIA|QWzԂsũՓe>PHӺ(?ioCJܸ1v8EP;$[Flnnj9b{_<"2gV&5_=IHeS边!zaJ!=h8%8HyR& IRGG=f)0R`;D4&utʣxMTD9ƕ7q|~Pz %IʛO6g??XyJ$j|Hb2qp6kM1ܰ +Lg'*/C#Dd,IJ>#+!_ `M!&i!5&,E)IIXi,`DX89 QaR1X>U$rx+*9D}3C=ŏ͔fD^suwL6NM{DҸ 1_:nyc?,gATnv ww |Yh[&@<z)j\7|)P5zmP9Gl=z],u:G)%,NZ r=B&۾f%Y' *̩ mM/~Y Q` nXKO%싐> ސM;;F(JYdFJݧ{ JdXS.`J")("9XӊNVO^]XMEPM)B!/^qNW+5AfX'̓lL(b«RIHDRLT2NS%063&X,c)Ob FQM=%K\2LkQ.Ӧi=^RM$on%fTK# [urJH~3_廱vs;v? qW:/۱' 5[0#L0M,B(.M"S8QckaoKj5n>Q@=e2ӆ\^Z 6a$?onFKGIEe0Zb7Z$Ap^ 2fZnj_ơp.B}N sqp(ZЄ$d@;I)J)6EJ("(̅dG-f%I5sB_&ߐ,%F+i&! kbRmYPL\TiK'^'I/ \XbJㄾଣWhE~qc1-%WQ$g=Cj.M k^(| \3%R arM؍_> rD+ǽg&+>kBICzŇ,_4KֺLe[/l:}[ Y(dv pD(/z.5c!km$K},s$eY)L2#Y!cYKR'ΐ\&ыH.LpFm-gY"amHK,2Ym$ZC¶3k5UrsdơEuǨٳ;` ua3?mKzqn_9(TR1:Ֆ5op5#P{V6q8o ĕ=ͰGPxk74hih] \x[%+?Hl|='BS_S\*tCsiAS 3!L %aZl!(v,5kl`)xLJn;['aQv՝ko1 \cM »q(j?N(7EKȮx}?8{Cޯ`X,Jb ņ ˕ΩIk0 4%J1F|r~gݵdrF}C z4EqcQzrZ+` i(rMQkOnx wE]ӿlcZ[tݲwۋxSԲ\w|\lCqv@RacmV>'e18ʕjdCZ`v}\*^y >VS6)W=[ ~6[߅>ɐא$KWOc1hȹj rW17]4>~@rv6{ԨYyۦ3P̓ N Kh,lBcΙYM)nPiT"9E9nFn^ZJAD3v % }H[_|M/cԜqS VX×`s AU|0'|@GjYxw=0.9KhFM*ؾ|vzq gI d#*(25#+pҴSo ڄ! 
192.168.126.11:10357: read: connection reset by peer" start-of-body=
Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.141378    4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:44098->192.168.126.11:10357: read: connection reset by peer"
Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.141467    4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.141818    4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.143822    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.143913    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.143936    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.145250    4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.145627    4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098" gracePeriod=30
Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.583670    4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:20Z is after 2026-02-23T05:33:13Z
Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.854889    4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.855331    4816 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098" exitCode=255
Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.855413    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098"}
Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.855461    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551"}
Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.855653    4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.857146    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.857275    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.857302    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:21 crc kubenswrapper[4816]: I0316 00:07:21.582523    4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:21Z is after 2026-02-23T05:33:13Z
Mar 16 00:07:22 crc kubenswrapper[4816]: E0316 00:07:22.324481    4816 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:22Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189d29a2d41dfe7d  default    0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.574019709 +0000 UTC m=+0.670319702,LastTimestamp:2026-03-16 00:06:47.574019709 +0000 UTC m=+0.670319702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 16 00:07:22 crc kubenswrapper[4816]: I0316 00:07:22.583683    4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:22Z is after 2026-02-23T05:33:13Z
Mar 16 00:07:22 crc kubenswrapper[4816]: E0316 00:07:22.735415    4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:22Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 16 00:07:22 crc kubenswrapper[4816]: I0316 00:07:22.739585    4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:22 crc kubenswrapper[4816]: I0316 00:07:22.741494    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:22 crc kubenswrapper[4816]: I0316 00:07:22.741602    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:22 crc kubenswrapper[4816]: I0316 00:07:22.741622    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:22 crc kubenswrapper[4816]: I0316 00:07:22.741675    4816 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 16 00:07:22 crc kubenswrapper[4816]: E0316 00:07:22.746791    4816 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:22Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 16 00:07:23 crc kubenswrapper[4816]: I0316 00:07:23.583580    4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:23Z is after 2026-02-23T05:33:13Z
Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.581979    4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:24Z is after 2026-02-23T05:33:13Z
Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.660697    4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.660992    4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.662805    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.662881    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.662908    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.667363    4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.673272    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.673350    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.673377    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.674433    4816 scope.go:117] "RemoveContainer" containerID="6d233ce0529e5c4ef8adad4fcd1615994765511ce6dba51708a5f933ea9c3a3e"
Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.585824    4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:25Z is after 2026-02-23T05:33:13Z
Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.874123    4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.875254    4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.877770    4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="97295c99d30410e470f248a46f06606331693794a78b950d14f87ec94dc3c6d8" exitCode=255
Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.877825    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"97295c99d30410e470f248a46f06606331693794a78b950d14f87ec94dc3c6d8"}
Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.877884    4816 scope.go:117] "RemoveContainer" containerID="6d233ce0529e5c4ef8adad4fcd1615994765511ce6dba51708a5f933ea9c3a3e"
Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.878090    4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.879240    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.879280    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.879319    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.879958    4816 scope.go:117] "RemoveContainer" containerID="97295c99d30410e470f248a46f06606331693794a78b950d14f87ec94dc3c6d8"
Mar 16 00:07:25 crc kubenswrapper[4816]: E0316 00:07:25.880221    4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 16 00:07:26 crc kubenswrapper[4816]: I0316 00:07:26.580926    4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:26Z is after 2026-02-23T05:33:13Z
Mar 16 00:07:26 crc kubenswrapper[4816]: I0316 00:07:26.808581    4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:07:26 crc kubenswrapper[4816]: I0316 00:07:26.881774    4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 16 00:07:26 crc kubenswrapper[4816]: I0316 00:07:26.884169    4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:26 crc kubenswrapper[4816]: I0316 00:07:26.885281    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:26 crc kubenswrapper[4816]: I0316 00:07:26.885320    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:26 crc kubenswrapper[4816]: I0316 00:07:26.885333    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:26 crc kubenswrapper[4816]: I0316 00:07:26.885957    4816 scope.go:117] "RemoveContainer" containerID="97295c99d30410e470f248a46f06606331693794a78b950d14f87ec94dc3c6d8"
Mar 16 00:07:26 crc kubenswrapper[4816]: E0316 00:07:26.886188    4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 16 00:07:27 crc kubenswrapper[4816]: I0316 00:07:27.127015    4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 16 00:07:27 crc kubenswrapper[4816]: I0316 00:07:27.149664    4816 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 16 00:07:27 crc kubenswrapper[4816]: I0316 00:07:27.594587    4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:27 crc kubenswrapper[4816]: E0316 00:07:27.769963    4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 16 00:07:28 crc kubenswrapper[4816]: I0316 00:07:28.587455    4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:28 crc kubenswrapper[4816]: I0316 00:07:28.872040    4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:07:28 crc kubenswrapper[4816]: I0316 00:07:28.872795    4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:28 crc kubenswrapper[4816]: I0316 00:07:28.874528    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:28 crc kubenswrapper[4816]: I0316 00:07:28.874632    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:28 crc kubenswrapper[4816]: I0316 00:07:28.874651    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:29 crc kubenswrapper[4816]: I0316 00:07:29.586005    4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:29 crc kubenswrapper[4816]: E0316 00:07:29.743857    4816 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 16 00:07:29 crc kubenswrapper[4816]: I0316 00:07:29.747975    4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:29 crc kubenswrapper[4816]: I0316 00:07:29.750025    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:29 crc kubenswrapper[4816]: I0316 00:07:29.750110    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:29 crc kubenswrapper[4816]: I0316 00:07:29.750137    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:29 crc kubenswrapper[4816]: I0316 00:07:29.750188    4816 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 16 00:07:29 crc kubenswrapper[4816]: E0316 00:07:29.757524    4816 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 16 00:07:30 crc kubenswrapper[4816]: I0316 00:07:30.588095    4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:31 crc kubenswrapper[4816]: W0316 00:07:31.162499    4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 16 00:07:31 crc kubenswrapper[4816]: E0316 00:07:31.162681    4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 16 00:07:31 crc kubenswrapper[4816]: I0316 00:07:31.291862    4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:07:31 crc kubenswrapper[4816]: I0316 00:07:31.292157    4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:31 crc kubenswrapper[4816]: I0316 00:07:31.294950    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:31 crc kubenswrapper[4816]: I0316 00:07:31.295030    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:31 crc kubenswrapper[4816]: I0316 00:07:31.295047    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:31 crc kubenswrapper[4816]: I0316 00:07:31.295963    4816 scope.go:117] "RemoveContainer" containerID="97295c99d30410e470f248a46f06606331693794a78b950d14f87ec94dc3c6d8"
Mar 16 00:07:31 crc kubenswrapper[4816]: E0316 00:07:31.296203    4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 16 00:07:31 crc kubenswrapper[4816]: I0316 00:07:31.586876    4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:31 crc kubenswrapper[4816]: I0316 00:07:31.873592    4816 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 16 00:07:31 crc kubenswrapper[4816]: I0316 00:07:31.873688    4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.332482    4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d41dfe7d  default    0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.574019709 +0000 UTC m=+0.670319702,LastTimestamp:2026-03-16 00:06:47.574019709 +0000 UTC m=+0.670319702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.341299    4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8eed912  default    0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654816018 +0000 UTC m=+0.751116011,LastTimestamp:2026-03-16 00:06:47.654816018 +0000 UTC m=+0.751116011,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.348893    4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef4dca  default    0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654845898 +0000 UTC m=+0.751145891,LastTimestamp:2026-03-16 00:06:47.654845898 +0000 UTC m=+0.751145891,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 16 00:07:32 crc kubenswrapper[4816]: E0316
00:07:32.356474 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef9bb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654865849 +0000 UTC m=+0.751165852,LastTimestamp:2026-03-16 00:06:47.654865849 +0000 UTC m=+0.751165852,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.363401 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2df2b0113 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.759421715 +0000 UTC m=+0.855721668,LastTimestamp:2026-03-16 00:06:47.759421715 +0000 UTC m=+0.855721668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.371133 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8eed912\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the 
namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8eed912 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654816018 +0000 UTC m=+0.751116011,LastTimestamp:2026-03-16 00:06:47.768479757 +0000 UTC m=+0.864779710,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.378926 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef4dca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef4dca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654845898 +0000 UTC m=+0.751145891,LastTimestamp:2026-03-16 00:06:47.768507708 +0000 UTC m=+0.864807661,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.386027 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef9bb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef9bb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654865849 +0000 UTC m=+0.751165852,LastTimestamp:2026-03-16 00:06:47.768519078 +0000 UTC m=+0.864819031,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.393258 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8eed912\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8eed912 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654816018 +0000 UTC m=+0.751116011,LastTimestamp:2026-03-16 00:06:47.770283342 +0000 UTC m=+0.866583335,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.400859 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef4dca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef4dca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654845898 +0000 UTC m=+0.751145891,LastTimestamp:2026-03-16 00:06:47.770308312 +0000 UTC m=+0.866608305,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.408538 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef9bb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef9bb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654865849 +0000 UTC m=+0.751165852,LastTimestamp:2026-03-16 00:06:47.770326442 +0000 UTC m=+0.866626425,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.415488 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8eed912\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8eed912 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654816018 +0000 UTC 
m=+0.751116011,LastTimestamp:2026-03-16 00:06:47.771480614 +0000 UTC m=+0.867780557,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.422588 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef4dca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef4dca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654845898 +0000 UTC m=+0.751145891,LastTimestamp:2026-03-16 00:06:47.771504345 +0000 UTC m=+0.867804298,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.429526 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef9bb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef9bb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654865849 +0000 UTC m=+0.751165852,LastTimestamp:2026-03-16 00:06:47.771524235 +0000 UTC m=+0.867824188,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.436729 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8eed912\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8eed912 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654816018 +0000 UTC m=+0.751116011,LastTimestamp:2026-03-16 00:06:47.771930293 +0000 UTC m=+0.868230286,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.444848 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef4dca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef4dca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654845898 +0000 UTC m=+0.751145891,LastTimestamp:2026-03-16 00:06:47.771959423 +0000 UTC m=+0.868259416,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.452041 4816 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef9bb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef9bb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654865849 +0000 UTC m=+0.751165852,LastTimestamp:2026-03-16 00:06:47.771982094 +0000 UTC m=+0.868282087,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.459414 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8eed912\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8eed912 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654816018 +0000 UTC m=+0.751116011,LastTimestamp:2026-03-16 00:06:47.772875671 +0000 UTC m=+0.869175624,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.468484 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef4dca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef4dca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654845898 +0000 UTC m=+0.751145891,LastTimestamp:2026-03-16 00:06:47.772904371 +0000 UTC m=+0.869204324,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.474686 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef9bb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef9bb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654865849 +0000 UTC m=+0.751165852,LastTimestamp:2026-03-16 00:06:47.772914312 +0000 UTC m=+0.869214265,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.482095 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8eed912\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8eed912 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654816018 +0000 UTC m=+0.751116011,LastTimestamp:2026-03-16 00:06:47.773514023 +0000 UTC m=+0.869814016,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.487989 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef4dca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef4dca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654845898 +0000 UTC m=+0.751145891,LastTimestamp:2026-03-16 00:06:47.773587264 +0000 UTC m=+0.869887257,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.495228 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef9bb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef9bb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654865849 +0000 UTC m=+0.751165852,LastTimestamp:2026-03-16 00:06:47.773607615 +0000 UTC m=+0.869907598,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.502510 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8eed912\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8eed912 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654816018 +0000 UTC m=+0.751116011,LastTimestamp:2026-03-16 00:06:47.774255527 +0000 UTC m=+0.870555510,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.511438 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef4dca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef4dca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654845898 +0000 UTC 
m=+0.751145891,LastTimestamp:2026-03-16 00:06:47.774290908 +0000 UTC m=+0.870590891,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.522865 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d29a2f7affde1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.170790369 +0000 UTC m=+1.267090332,LastTimestamp:2026-03-16 00:06:48.170790369 +0000 UTC m=+1.267090332,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.530165 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a2f7b1a067 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.170897511 +0000 UTC m=+1.267197504,LastTimestamp:2026-03-16 00:06:48.170897511 +0000 UTC m=+1.267197504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.537862 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a2f7b3baf8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.171035384 +0000 UTC m=+1.267335347,LastTimestamp:2026-03-16 00:06:48.171035384 +0000 UTC m=+1.267335347,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.546009 
4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a2f8a0499e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.186538398 +0000 UTC m=+1.282838391,LastTimestamp:2026-03-16 00:06:48.186538398 +0000 UTC m=+1.282838391,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.554916 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a2f97f1f50 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.201142096 +0000 UTC m=+1.297442089,LastTimestamp:2026-03-16 00:06:48.201142096 
+0000 UTC m=+1.297442089,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.563105 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a31b79fcba openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.771230906 +0000 UTC m=+1.867530899,LastTimestamp:2026-03-16 00:06:48.771230906 +0000 UTC m=+1.867530899,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.567532 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a31c03f106 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container 
wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.780271878 +0000 UTC m=+1.876571841,LastTimestamp:2026-03-16 00:06:48.780271878 +0000 UTC m=+1.876571841,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.569375 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a31c1d710a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.78194305 +0000 UTC m=+1.878243003,LastTimestamp:2026-03-16 00:06:48.78194305 +0000 UTC m=+1.878243003,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.574256 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a31c6ef869 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.787286121 +0000 UTC m=+1.883586074,LastTimestamp:2026-03-16 00:06:48.787286121 +0000 UTC m=+1.883586074,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.576055 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a31c7d3b6b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.788220779 +0000 UTC m=+1.884520752,LastTimestamp:2026-03-16 00:06:48.788220779 +0000 UTC m=+1.884520752,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.581653 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d29a31c83da23 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.788654627 +0000 UTC m=+1.884954620,LastTimestamp:2026-03-16 00:06:48.788654627 +0000 UTC m=+1.884954620,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: I0316 00:07:32.581762 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.587577 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a31c981042 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.789979202 +0000 UTC m=+1.886279145,LastTimestamp:2026-03-16 00:06:48.789979202 +0000 UTC 
m=+1.886279145,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.594420 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a31d4d45c6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.801854918 +0000 UTC m=+1.898154891,LastTimestamp:2026-03-16 00:06:48.801854918 +0000 UTC m=+1.898154891,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.601687 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a31d738a40 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.804362816 +0000 UTC m=+1.900662779,LastTimestamp:2026-03-16 00:06:48.804362816 +0000 UTC 
m=+1.900662779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.607885 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a31d8a0a34 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.805837364 +0000 UTC m=+1.902137317,LastTimestamp:2026-03-16 00:06:48.805837364 +0000 UTC m=+1.902137317,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.614157 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d29a31deb1266 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.812196454 +0000 UTC 
m=+1.908496417,LastTimestamp:2026-03-16 00:06:48.812196454 +0000 UTC m=+1.908496417,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.621318 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a32ec9a88d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.095219341 +0000 UTC m=+2.191519304,LastTimestamp:2026-03-16 00:06:49.095219341 +0000 UTC m=+2.191519304,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.629160 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a32fadc4b3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.110168755 +0000 UTC m=+2.206468748,LastTimestamp:2026-03-16 00:06:49.110168755 +0000 UTC m=+2.206468748,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.635881 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a32fc89d41 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.111928129 +0000 UTC m=+2.208228122,LastTimestamp:2026-03-16 00:06:49.111928129 +0000 UTC m=+2.208228122,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.642818 4816 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a33dbdbfa1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.346097057 +0000 UTC m=+2.442397030,LastTimestamp:2026-03-16 00:06:49.346097057 +0000 UTC m=+2.442397030,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.649708 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a33e93d854 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.360128084 +0000 UTC m=+2.456428057,LastTimestamp:2026-03-16 00:06:49.360128084 +0000 UTC m=+2.456428057,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.656058 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a33ea5db64 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.361308516 +0000 UTC m=+2.457608469,LastTimestamp:2026-03-16 00:06:49.361308516 +0000 UTC m=+2.457608469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.662589 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a34c7cb99d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.593493917 +0000 UTC m=+2.689793870,LastTimestamp:2026-03-16 00:06:49.593493917 +0000 UTC m=+2.689793870,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.669800 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a34d7242e7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.609585383 +0000 UTC m=+2.705885336,LastTimestamp:2026-03-16 00:06:49.609585383 +0000 UTC m=+2.705885336,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.677031 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a35249cbd0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.690819536 +0000 UTC m=+2.787119489,LastTimestamp:2026-03-16 00:06:49.690819536 +0000 UTC m=+2.787119489,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.684515 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a35261c220 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.69238992 +0000 UTC m=+2.788689873,LastTimestamp:2026-03-16 00:06:49.69238992 +0000 UTC m=+2.788689873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.691828 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d29a3527da33d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.694217021 +0000 UTC m=+2.790516974,LastTimestamp:2026-03-16 00:06:49.694217021 +0000 UTC m=+2.790516974,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.699285 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a352ba8ca5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.698208933 +0000 UTC m=+2.794508926,LastTimestamp:2026-03-16 00:06:49.698208933 +0000 UTC m=+2.794508926,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.705685 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a362973661 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.964328545 +0000 UTC m=+3.060628498,LastTimestamp:2026-03-16 00:06:49.964328545 +0000 UTC m=+3.060628498,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.712169 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a362a0900a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.964941322 +0000 UTC m=+3.061241275,LastTimestamp:2026-03-16 00:06:49.964941322 +0000 UTC m=+3.061241275,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.719469 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d29a362a1288b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.964980363 +0000 UTC m=+3.061280316,LastTimestamp:2026-03-16 00:06:49.964980363 +0000 UTC m=+3.061280316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.726474 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a362a7d895 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.965418645 +0000 UTC m=+3.061718598,LastTimestamp:2026-03-16 00:06:49.965418645 +0000 UTC m=+3.061718598,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.734435 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d29a3635f93f7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.977459703 +0000 UTC m=+3.073759656,LastTimestamp:2026-03-16 00:06:49.977459703 +0000 UTC m=+3.073759656,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.741051 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a36384f506 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.979909382 +0000 UTC m=+3.076209335,LastTimestamp:2026-03-16 00:06:49.979909382 +0000 UTC m=+3.076209335,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.747248 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a363954488 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.980978312 +0000 UTC m=+3.077278265,LastTimestamp:2026-03-16 00:06:49.980978312 +0000 UTC m=+3.077278265,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.753823 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a363a73dd5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.982156245 +0000 UTC m=+3.078456198,LastTimestamp:2026-03-16 00:06:49.982156245 +0000 UTC m=+3.078456198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.760592 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a363b44e39 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.983012409 +0000 UTC m=+3.079312362,LastTimestamp:2026-03-16 00:06:49.983012409 +0000 UTC m=+3.079312362,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.768608 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a36437e593 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.991636371 +0000 UTC m=+3.087936324,LastTimestamp:2026-03-16 00:06:49.991636371 +0000 UTC m=+3.087936324,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.780644 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a370370dd6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.192907734 +0000 UTC m=+3.289207687,LastTimestamp:2026-03-16 00:06:50.192907734 +0000 UTC m=+3.289207687,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.783022 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a370522c7e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.194685054 +0000 UTC m=+3.290985007,LastTimestamp:2026-03-16 00:06:50.194685054 +0000 UTC m=+3.290985007,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.789305 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a370ec8c00 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.204802048 +0000 UTC m=+3.301102011,LastTimestamp:2026-03-16 00:06:50.204802048 +0000 UTC m=+3.301102011,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.796043 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a371011dbc openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.206150076 +0000 UTC m=+3.302450029,LastTimestamp:2026-03-16 00:06:50.206150076 +0000 UTC m=+3.302450029,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc 
kubenswrapper[4816]: E0316 00:07:32.802958 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a37120d9b6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.208229814 +0000 UTC m=+3.304529767,LastTimestamp:2026-03-16 00:06:50.208229814 +0000 UTC m=+3.304529767,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.808772 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a371838c6f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.214698095 +0000 UTC 
m=+3.310998048,LastTimestamp:2026-03-16 00:06:50.214698095 +0000 UTC m=+3.310998048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.814540 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a37ce5acb4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.40567826 +0000 UTC m=+3.501978233,LastTimestamp:2026-03-16 00:06:50.40567826 +0000 UTC m=+3.501978233,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.818977 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a37d0a9f86 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created 
container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.408099718 +0000 UTC m=+3.504399681,LastTimestamp:2026-03-16 00:06:50.408099718 +0000 UTC m=+3.504399681,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.824934 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a37df1e6e3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.423256803 +0000 UTC m=+3.519556786,LastTimestamp:2026-03-16 00:06:50.423256803 +0000 UTC m=+3.519556786,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.829687 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a37e13dc60 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.425482336 +0000 UTC m=+3.521782289,LastTimestamp:2026-03-16 00:06:50.425482336 +0000 UTC m=+3.521782289,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.834083 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a37e28af62 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.426847074 +0000 UTC m=+3.523147027,LastTimestamp:2026-03-16 00:06:50.426847074 +0000 UTC m=+3.523147027,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.840236 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a38af0d303 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.641289987 +0000 UTC m=+3.737589940,LastTimestamp:2026-03-16 00:06:50.641289987 +0000 UTC m=+3.737589940,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.846211 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a38bb8a484 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.654385284 +0000 UTC m=+3.750685257,LastTimestamp:2026-03-16 00:06:50.654385284 +0000 UTC m=+3.750685257,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.852536 
4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a38bcb4692 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.655606418 +0000 UTC m=+3.751906371,LastTimestamp:2026-03-16 00:06:50.655606418 +0000 UTC m=+3.751906371,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.860101 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a38f6245ed openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.715833837 +0000 UTC 
m=+3.812133780,LastTimestamp:2026-03-16 00:06:50.715833837 +0000 UTC m=+3.812133780,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.867280 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a399d30cfd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.890996989 +0000 UTC m=+3.987296952,LastTimestamp:2026-03-16 00:06:50.890996989 +0000 UTC m=+3.987296952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.871416 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a39aaa9abb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.905123515 +0000 UTC m=+4.001423468,LastTimestamp:2026-03-16 00:06:50.905123515 +0000 UTC m=+4.001423468,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.877877 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a39e533c0a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.966506506 +0000 UTC m=+4.062806469,LastTimestamp:2026-03-16 00:06:50.966506506 +0000 UTC m=+4.062806469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.883114 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a39f08dcbc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container 
etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.97840966 +0000 UTC m=+4.074709653,LastTimestamp:2026-03-16 00:06:50.97840966 +0000 UTC m=+4.074709653,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.885354 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a3cca663f4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:51.743708148 +0000 UTC m=+4.840008101,LastTimestamp:2026-03-16 00:06:51.743708148 +0000 UTC m=+4.840008101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.890186 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a3d9f88e2b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:51.967196715 +0000 UTC m=+5.063496708,LastTimestamp:2026-03-16 00:06:51.967196715 +0000 UTC m=+5.063496708,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.895020 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a3da7fe8d3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:51.976067283 +0000 UTC m=+5.072367266,LastTimestamp:2026-03-16 00:06:51.976067283 +0000 UTC m=+5.072367266,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.899395 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a3da9a9696 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:51.977815702 +0000 UTC m=+5.074115655,LastTimestamp:2026-03-16 00:06:51.977815702 +0000 UTC m=+5.074115655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.905334 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a3e5e12583 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.166989187 +0000 UTC m=+5.263289180,LastTimestamp:2026-03-16 00:06:52.166989187 +0000 UTC m=+5.263289180,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.912478 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a3e6ff6a07 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.185750023 +0000 UTC m=+5.282049986,LastTimestamp:2026-03-16 00:06:52.185750023 +0000 UTC m=+5.282049986,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.919003 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a3e722a22d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.188058157 +0000 UTC m=+5.284358150,LastTimestamp:2026-03-16 00:06:52.188058157 +0000 UTC m=+5.284358150,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.923370 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189d29a3f5a42894 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.431427732 +0000 UTC m=+5.527727725,LastTimestamp:2026-03-16 00:06:52.431427732 +0000 UTC m=+5.527727725,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.927360 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a3f6dccea4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.451917476 +0000 UTC m=+5.548217469,LastTimestamp:2026-03-16 00:06:52.451917476 +0000 UTC m=+5.548217469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.932490 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a3f6f563d3 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.453528531 +0000 UTC m=+5.549828524,LastTimestamp:2026-03-16 00:06:52.453528531 +0000 UTC m=+5.549828524,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.936807 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a405d8f234 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.703322676 +0000 UTC m=+5.799622629,LastTimestamp:2026-03-16 00:06:52.703322676 +0000 UTC m=+5.799622629,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.942997 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189d29a406995351 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.715930449 +0000 UTC m=+5.812230402,LastTimestamp:2026-03-16 00:06:52.715930449 +0000 UTC m=+5.812230402,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.950590 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a406aa8885 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.717058181 +0000 UTC m=+5.813358174,LastTimestamp:2026-03-16 00:06:52.717058181 +0000 UTC m=+5.813358174,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.957992 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a415166091 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.959006865 +0000 UTC m=+6.055306828,LastTimestamp:2026-03-16 00:06:52.959006865 +0000 UTC m=+6.055306828,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.962159 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a416246ddf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.976704991 +0000 UTC m=+6.073004954,LastTimestamp:2026-03-16 00:06:52.976704991 +0000 UTC m=+6.073004954,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.969806 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 16 00:07:32 crc 
kubenswrapper[4816]: &Event{ObjectMeta:{kube-apiserver-crc.189d29a605d4ae68 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 16 00:07:32 crc kubenswrapper[4816]: body: Mar 16 00:07:32 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:01.292977768 +0000 UTC m=+14.389277751,LastTimestamp:2026-03-16 00:07:01.292977768 +0000 UTC m=+14.389277751,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:07:32 crc kubenswrapper[4816]: > Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.976199 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a605d6e47a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:01.293122682 +0000 UTC m=+14.389422645,LastTimestamp:2026-03-16 00:07:01.293122682 +0000 UTC 
m=+14.389422645,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.982398 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 16 00:07:32 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-controller-manager-crc.189d29a62855d360 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 16 00:07:32 crc kubenswrapper[4816]: body: Mar 16 00:07:32 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:01.87186672 +0000 UTC m=+14.968166673,LastTimestamp:2026-03-16 00:07:01.87186672 +0000 UTC m=+14.968166673,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:07:32 crc kubenswrapper[4816]: > Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.989500 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a62856772b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:01.871908651 +0000 UTC m=+14.968208604,LastTimestamp:2026-03-16 00:07:01.871908651 +0000 UTC m=+14.968208604,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.994796 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 16 00:07:32 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-apiserver-crc.189d29a6426f7023 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 16 00:07:32 crc kubenswrapper[4816]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 16 00:07:32 crc kubenswrapper[4816]: Mar 16 00:07:32 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:02.309752867 +0000 UTC m=+15.406052820,LastTimestamp:2026-03-16 
00:07:02.309752867 +0000 UTC m=+15.406052820,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:07:32 crc kubenswrapper[4816]: > Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.000699 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a6427062f5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:02.309815029 +0000 UTC m=+15.406114982,LastTimestamp:2026-03-16 00:07:02.309815029 +0000 UTC m=+15.406114982,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.006386 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d29a6426f7023\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 16 00:07:33 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-apiserver-crc.189d29a6426f7023 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 16 00:07:33 crc kubenswrapper[4816]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 16 00:07:33 crc kubenswrapper[4816]: Mar 16 00:07:33 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:02.309752867 +0000 UTC m=+15.406052820,LastTimestamp:2026-03-16 00:07:02.317635668 +0000 UTC m=+15.413935651,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:07:33 crc kubenswrapper[4816]: > Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.011876 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d29a6427062f5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a6427062f5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:02.309815029 +0000 UTC m=+15.406114982,LastTimestamp:2026-03-16 00:07:02.317831834 +0000 UTC m=+15.414131827,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.016605 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d29a38bcb4692\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a38bcb4692 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.655606418 +0000 UTC m=+3.751906371,LastTimestamp:2026-03-16 00:07:02.792238026 +0000 UTC m=+15.888538019,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.025196 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d29a62855d360\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 16 00:07:33 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-controller-manager-crc.189d29a62855d360 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 16 00:07:33 crc kubenswrapper[4816]: body: Mar 16 00:07:33 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:01.87186672 +0000 UTC m=+14.968166673,LastTimestamp:2026-03-16 00:07:11.872281899 +0000 UTC m=+24.968581872,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:07:33 crc kubenswrapper[4816]: > Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.031025 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d29a62856772b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a62856772b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:01.871908651 +0000 UTC m=+14.968208604,LastTimestamp:2026-03-16 00:07:11.872352241 
+0000 UTC m=+24.968652214,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.036333 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 16 00:07:33 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-controller-manager-crc.189d29aa69480dd9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:44098->192.168.126.11:10357: read: connection reset by peer Mar 16 00:07:33 crc kubenswrapper[4816]: body: Mar 16 00:07:33 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:20.141352409 +0000 UTC m=+33.237652392,LastTimestamp:2026-03-16 00:07:20.141352409 +0000 UTC m=+33.237652392,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:07:33 crc kubenswrapper[4816]: > Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.043521 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29aa69492b5d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:44098->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:20.141425501 +0000 UTC m=+33.237725484,LastTimestamp:2026-03-16 00:07:20.141425501 +0000 UTC m=+33.237725484,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.048458 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29aa6988ceeb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:20.145596139 +0000 UTC m=+33.241896132,LastTimestamp:2026-03-16 00:07:20.145596139 +0000 UTC m=+33.241896132,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.053429 4816 event.go:359] "Server rejected event (will not retry!)" 
err="events \"kube-controller-manager-crc.189d29a31c981042\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a31c981042 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.789979202 +0000 UTC m=+1.886279145,LastTimestamp:2026-03-16 00:07:20.163698708 +0000 UTC m=+33.259998701,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.058643 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d29a32ec9a88d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a32ec9a88d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 
00:06:49.095219341 +0000 UTC m=+2.191519304,LastTimestamp:2026-03-16 00:07:20.406030355 +0000 UTC m=+33.502330348,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.063717 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d29a32fadc4b3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a32fadc4b3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.110168755 +0000 UTC m=+2.206468748,LastTimestamp:2026-03-16 00:07:20.41973216 +0000 UTC m=+33.516032123,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.074087 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d29a62855d360\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 16 00:07:33 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-controller-manager-crc.189d29a62855d360 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 16 00:07:33 crc kubenswrapper[4816]: body: Mar 16 00:07:33 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:01.87186672 +0000 UTC m=+14.968166673,LastTimestamp:2026-03-16 00:07:31.873662959 +0000 UTC m=+44.969962952,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:07:33 crc kubenswrapper[4816]: > Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.080503 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d29a62856772b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a62856772b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:01.871908651 +0000 UTC m=+14.968208604,LastTimestamp:2026-03-16 00:07:31.873729691 
+0000 UTC m=+44.970029674,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:33 crc kubenswrapper[4816]: I0316 00:07:33.585790 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:34 crc kubenswrapper[4816]: I0316 00:07:34.584992 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:35 crc kubenswrapper[4816]: I0316 00:07:35.583527 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:35 crc kubenswrapper[4816]: W0316 00:07:35.589333 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 16 00:07:35 crc kubenswrapper[4816]: E0316 00:07:35.589409 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 16 00:07:36 crc kubenswrapper[4816]: I0316 00:07:36.586759 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:36 crc kubenswrapper[4816]: E0316 00:07:36.750904 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 16 00:07:36 crc kubenswrapper[4816]: I0316 00:07:36.757745 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:36 crc kubenswrapper[4816]: I0316 00:07:36.759152 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:36 crc kubenswrapper[4816]: I0316 00:07:36.759215 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:36 crc kubenswrapper[4816]: I0316 00:07:36.759235 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:36 crc kubenswrapper[4816]: I0316 00:07:36.759276 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:07:36 crc kubenswrapper[4816]: E0316 00:07:36.764040 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 16 00:07:36 crc kubenswrapper[4816]: W0316 00:07:36.888835 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:36 crc kubenswrapper[4816]: E0316 00:07:36.888901 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 16 00:07:37 crc kubenswrapper[4816]: W0316 00:07:37.333151 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 16 00:07:37 crc kubenswrapper[4816]: E0316 00:07:37.333704 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 16 00:07:37 crc kubenswrapper[4816]: I0316 00:07:37.581775 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:37 crc kubenswrapper[4816]: E0316 00:07:37.770171 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.473051 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.473305 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.475339 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.475412 4816 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.475433 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.587900 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.875508 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.875671 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.876713 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.876756 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.876770 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.880743 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.917910 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.918663 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:38 crc 
kubenswrapper[4816]: I0316 00:07:38.918699 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.918709 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:39 crc kubenswrapper[4816]: I0316 00:07:39.583510 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:40 crc kubenswrapper[4816]: I0316 00:07:40.583279 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:41 crc kubenswrapper[4816]: I0316 00:07:41.584922 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:42 crc kubenswrapper[4816]: I0316 00:07:42.583096 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:43 crc kubenswrapper[4816]: I0316 00:07:43.586682 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:43 crc kubenswrapper[4816]: E0316 00:07:43.759322 4816 controller.go:145] "Failed to ensure lease exists, will retry" 
err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 16 00:07:43 crc kubenswrapper[4816]: I0316 00:07:43.764461 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:43 crc kubenswrapper[4816]: I0316 00:07:43.766756 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:43 crc kubenswrapper[4816]: I0316 00:07:43.766852 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:43 crc kubenswrapper[4816]: I0316 00:07:43.766886 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:43 crc kubenswrapper[4816]: I0316 00:07:43.766947 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:07:43 crc kubenswrapper[4816]: E0316 00:07:43.773253 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 16 00:07:44 crc kubenswrapper[4816]: I0316 00:07:44.584854 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:45 crc kubenswrapper[4816]: I0316 00:07:45.586015 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:45 crc kubenswrapper[4816]: I0316 00:07:45.667882 4816 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Mar 16 00:07:45 crc kubenswrapper[4816]: I0316 00:07:45.669427 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:45 crc kubenswrapper[4816]: I0316 00:07:45.669490 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:45 crc kubenswrapper[4816]: I0316 00:07:45.669517 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:45 crc kubenswrapper[4816]: I0316 00:07:45.670533 4816 scope.go:117] "RemoveContainer" containerID="97295c99d30410e470f248a46f06606331693794a78b950d14f87ec94dc3c6d8" Mar 16 00:07:45 crc kubenswrapper[4816]: E0316 00:07:45.670933 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:46 crc kubenswrapper[4816]: I0316 00:07:46.586716 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:47 crc kubenswrapper[4816]: I0316 00:07:47.585615 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:47 crc kubenswrapper[4816]: E0316 00:07:47.770994 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: 
node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4816]: I0316 00:07:48.585629 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:49 crc kubenswrapper[4816]: I0316 00:07:49.585635 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:50 crc kubenswrapper[4816]: I0316 00:07:50.586239 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:50 crc kubenswrapper[4816]: E0316 00:07:50.765020 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 16 00:07:50 crc kubenswrapper[4816]: I0316 00:07:50.773955 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:50 crc kubenswrapper[4816]: I0316 00:07:50.775392 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:50 crc kubenswrapper[4816]: I0316 00:07:50.775477 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:50 crc kubenswrapper[4816]: I0316 00:07:50.775505 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:50 crc kubenswrapper[4816]: I0316 
00:07:50.775589 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:07:50 crc kubenswrapper[4816]: E0316 00:07:50.782424 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 16 00:07:51 crc kubenswrapper[4816]: I0316 00:07:51.584926 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:52 crc kubenswrapper[4816]: I0316 00:07:52.324610 4816 csr.go:261] certificate signing request csr-zsmc7 is approved, waiting to be issued Mar 16 00:07:52 crc kubenswrapper[4816]: I0316 00:07:52.335672 4816 csr.go:257] certificate signing request csr-zsmc7 is issued Mar 16 00:07:52 crc kubenswrapper[4816]: I0316 00:07:52.412952 4816 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 16 00:07:52 crc kubenswrapper[4816]: I0316 00:07:52.443857 4816 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 16 00:07:53 crc kubenswrapper[4816]: I0316 00:07:53.337429 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-09 07:44:42.497379377 +0000 UTC Mar 16 00:07:53 crc kubenswrapper[4816]: I0316 00:07:53.337511 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7183h36m49.159874094s for next certificate rotation Mar 16 00:07:57 crc kubenswrapper[4816]: E0316 00:07:57.771136 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.782937 4816 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.784472 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.784525 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.784543 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.784723 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.793732 4816 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.794034 4816 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 16 00:07:57 crc kubenswrapper[4816]: E0316 00:07:57.794067 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.798864 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.799057 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.799192 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.799328 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.799457 4816 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4816]: E0316 00:07:57.819692 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.832119 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.832171 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.832189 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.832216 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.832235 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4816]: E0316 00:07:57.850322 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.863523 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.863622 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.863644 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.863674 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.863696 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4816]: E0316 00:07:57.880161 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.891219 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.891305 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.891326 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.891357 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.891377 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4816]: E0316 00:07:57.908071 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 16 00:07:57 crc kubenswrapper[4816]: E0316 00:07:57.908196    4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 16 00:07:57 crc kubenswrapper[4816]: E0316 00:07:57.908231    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 00:07:58.009115    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 00:07:58.109704    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 00:07:58.210184    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 00:07:58.310917    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 00:07:58.411438    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 00:07:58.512509    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 00:07:58.613162    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.667397    4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.669333    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.669425    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.669447    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.670593    4816 scope.go:117] "RemoveContainer" containerID="97295c99d30410e470f248a46f06606331693794a78b950d14f87ec94dc3c6d8"
Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 00:07:58.713289    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 00:07:58.813938    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 00:07:58.914488    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.982302    4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.984954    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f"}
Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.985212    4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.986944    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.986987    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.987007    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.015089    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.115953    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.216943    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.317892    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.418900    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.519913    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.620695    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.721797    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.822492    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.923677    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.990200    4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.991373    4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.994740    4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f" exitCode=255
Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.994805    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f"}
Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.994878    4816 scope.go:117] "RemoveContainer" containerID="97295c99d30410e470f248a46f06606331693794a78b950d14f87ec94dc3c6d8"
Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.995063    4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.996250    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.996301    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.996321    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.997261    4816 scope.go:117] "RemoveContainer" containerID="29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f"
Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.997540    4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 00:08:00.024279    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 00:08:00.124761    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 00:08:00.225201    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 00:08:00.326007    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 00:08:00.426410    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 00:08:00.526627    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 00:08:00.626828    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 00:08:00.726957    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 00:08:00.827789    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 00:08:00.928922    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:01 crc kubenswrapper[4816]: I0316 00:08:00.999951    4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.029793    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.130827    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.231689    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:01 crc kubenswrapper[4816]: I0316 00:08:01.292367    4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:08:01 crc kubenswrapper[4816]: I0316 00:08:01.292591    4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:08:01 crc kubenswrapper[4816]: I0316 00:08:01.294187    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:01 crc kubenswrapper[4816]: I0316 00:08:01.294278    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:01 crc kubenswrapper[4816]: I0316 00:08:01.294352    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:01 crc kubenswrapper[4816]: I0316 00:08:01.295740    4816 scope.go:117] "RemoveContainer" containerID="29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f"
Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.296107    4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.332889    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.433881    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.534621    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.635038    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.735248    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.836357    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.936855    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.037674    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.138811    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.239276    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.340124    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.440892    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.541661    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.642660    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.743974    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.844665    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.945153    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:03 crc kubenswrapper[4816]: E0316 00:08:03.046225    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:03 crc kubenswrapper[4816]: E0316 00:08:03.146789    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:03 crc kubenswrapper[4816]: E0316 00:08:03.247308    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:03 crc kubenswrapper[4816]: E0316 00:08:03.348181    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:03 crc kubenswrapper[4816]: E0316 00:08:03.449161    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:03 crc kubenswrapper[4816]: E0316 00:08:03.550332    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:03 crc kubenswrapper[4816]: E0316 00:08:03.651269    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:03 crc kubenswrapper[4816]: E0316 00:08:03.752038    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:03 crc kubenswrapper[4816]: E0316 00:08:03.853523    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:03 crc kubenswrapper[4816]: E0316 00:08:03.954126    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.054836    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.155700    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.256260    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.357526    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.458033    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.559439    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.659993    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:04 crc kubenswrapper[4816]: I0316 00:08:04.666842    4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:08:04 crc kubenswrapper[4816]: I0316 00:08:04.668586    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:04 crc kubenswrapper[4816]: I0316 00:08:04.668835    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:04 crc kubenswrapper[4816]: I0316 00:08:04.668983    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.760649    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.860828    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.961492    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.062296    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.163078    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.263701    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.363858    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.464010    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.564523    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.665573    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.766201    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.866683    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.967834    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.068600    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.169082    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.269640    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.370630    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.471647    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.572936    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.673623    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.774613    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:06 crc kubenswrapper[4816]: I0316 00:08:06.808116    4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:08:06 crc kubenswrapper[4816]: I0316 00:08:06.808377    4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:08:06 crc kubenswrapper[4816]: I0316 00:08:06.809999    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:06 crc kubenswrapper[4816]: I0316 00:08:06.810073    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:06 crc kubenswrapper[4816]: I0316 00:08:06.810102    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:06 crc kubenswrapper[4816]: I0316 00:08:06.811067    4816 scope.go:117] "RemoveContainer" containerID="29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f"
Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.811348    4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.875144    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.975868    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.076360    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.176845    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.277898    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.378659    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.479847    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.580974    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.681386    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:07 crc kubenswrapper[4816]: I0316 00:08:07.690607    4816 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.771347    4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.781610    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.882199    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.982624    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.083773    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.184628    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.219538    4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.225481    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.225544    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.225599    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.225635    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.225660    4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.245216    4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.250453 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.250503 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.250524 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.250583 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.250605 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.268234 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.274167 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.274234 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.274249 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.274276 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.274293 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.289363 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.295277 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.295354 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.295373 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.295398 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.295416 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.311770 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.312021 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.312075 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.412449 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.513340 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.592911 4816 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.607671 4816 apiserver.go:52] "Watching apiserver" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.614312 4816 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.614725 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.615348 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.615873 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.616008 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.616240 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.616305 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.616360 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.616385 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.617114 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.617156 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.617216 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.617211 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.617233 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.617289 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.617310 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.620945 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.621233 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.621307 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.621612 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.624868 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.625480 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.625475 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.626959 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.630670 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.670273 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.686223 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.697013 4816 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.704658 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.719880 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.720649 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.720687 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.720700 4816 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.720721 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.720735 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.740736 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.747726 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.747775 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.747811 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.747846 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748009 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748047 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748081 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748115 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748149 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748186 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748220 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748250 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748281 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748275 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748312 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748371 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748411 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748446 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748481 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748516 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748577 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748609 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748645 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748675 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748709 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748740 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748775 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748782 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748843 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748806 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.748948 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:09.248902793 +0000 UTC m=+82.345202786 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749015 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749036 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749072 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749114 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749153 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749196 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749226 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749248 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749302 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749359 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749396 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749404 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749434 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749472 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749506 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749585 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749625 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749639 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749666 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749720 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749763 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749773 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749804 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749897 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749934 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749971 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750004 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750037 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750075 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750110 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750145 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750181 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750214 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750246 
4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750277 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750308 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750355 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750398 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750436 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" 
(UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750468 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750502 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750517 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750539 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750616 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750597 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750667 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750707 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750744 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750778 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750842 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750882 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750915 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750948 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750982 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751016 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751071 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 
16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751105 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751142 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751176 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751207 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751239 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751270 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751303 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751340 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751373 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751410 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751445 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751485 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751485 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751526 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751540 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751685 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751855 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752029 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752099 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751588 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752489 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752540 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752603 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752612 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752636 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752789 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752822 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752849 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752872 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752896 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752930 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752951 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752970 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752997 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753018 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753007 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753129 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753038 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753224 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753293 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: 
I0316 00:08:08.753351 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753361 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753394 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753413 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753470 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753536 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753631 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753686 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753741 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753760 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753802 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753858 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753916 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753853 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753976 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753874 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754067 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754129 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754183 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754234 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754239 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754286 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754340 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754386 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754428 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754524 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754596 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754639 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754678 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754715 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754759 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754779 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod 
"7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754789 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754902 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754951 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754989 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755175 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755223 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755261 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755302 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755340 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755384 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755441 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755489 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756080 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756146 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756210 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756269 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756326 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756386 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756444 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 16 00:08:08 crc 
kubenswrapper[4816]: I0316 00:08:08.756498 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756535 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756623 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756678 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756795 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756863 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756911 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756970 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757018 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757075 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757133 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757190 
4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757246 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757308 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757367 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757426 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757490 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757592 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757652 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757703 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757679 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757747 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757790 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757833 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757871 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757910 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757949 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757988 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758027 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758064 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758102 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758146 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758205 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758251 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758289 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758328 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758397 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758461 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758517 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758590 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758630 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758674 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758714 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758754 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758792 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758866 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758906 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758952 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759010 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759114 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759183 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759244 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759305 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759363 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759427 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759497 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759612 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759677 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759855 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:08:08 
crc kubenswrapper[4816]: I0316 00:08:08.759950 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760102 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760187 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760227 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760335 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760362 4816 
reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760387 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760409 4816 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760432 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760456 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760480 4816 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760502 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760525 4816 reconciler_common.go:293] 
"Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760583 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760608 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760631 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760656 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760678 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760699 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760721 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760745 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760766 4816 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760786 4816 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760808 4816 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760829 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760850 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760874 4816 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc 
kubenswrapper[4816]: I0316 00:08:08.760895 4816 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760917 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760941 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760962 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760983 4816 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.761005 4816 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.761028 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.762350 4816 swap_util.go:74] "error creating dir 
to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755943 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756041 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756244 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756284 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755464 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756317 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756346 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756462 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756726 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756749 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756779 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757067 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757105 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757438 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.763213 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.763227 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758137 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758125 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758405 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758422 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758503 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759007 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759049 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759054 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759092 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759122 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759192 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759905 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760172 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760212 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760239 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760841 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760895 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.761910 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.762065 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.762172 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.762537 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.763277 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.763518 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.763607 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.763736 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.763600 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.764046 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.764603 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.764788 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.764942 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.765052 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.765080 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.765191 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.765294 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.765752 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.765906 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.766597 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.766657 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.767197 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.767584 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.768061 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.768371 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.768371 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.768526 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.769107 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.769401 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.769458 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.769656 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.769843 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.769905 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.769920 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.770337 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.770498 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.770866 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.771144 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.771421 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.773221 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.773321 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:09.273294046 +0000 UTC m=+82.369594039 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.774514 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.770951 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.777358 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:09.277334241 +0000 UTC m=+82.373634204 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.769693 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.784338 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.784865 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.785348 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.787852 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.788147 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.789724 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.789774 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.789804 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.789922 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:09.289884299 +0000 UTC m=+82.386184432 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.792453 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.792492 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.792520 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.792645 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:09.29261707 +0000 UTC m=+82.388917203 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.792683 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.792767 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.792946 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.793139 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.793257 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.793376 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.793467 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.793506 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.793673 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.794448 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.795250 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.796399 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.796843 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.798819 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.798830 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.799419 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.799811 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.799829 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.799913 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.800304 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.800512 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.800842 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.800903 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.801434 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.802618 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.802978 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.803001 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.803164 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.803633 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.803853 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.804650 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.805901 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.805987 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.806146 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.806326 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.806960 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.807114 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.807147 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.809617 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.810027 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.810305 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.811864 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.811906 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.812092 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.812223 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.812596 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.813443 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.813727 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.814297 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.814770 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.814893 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.815017 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.815041 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.815079 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.815231 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.815720 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.816175 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.816187 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.816952 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.816995 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.816898 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.817007 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.818881 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.819215 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.819371 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.819430 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.820019 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.820148 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.820157 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.820430 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.820451 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.820660 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.820805 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.820905 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.821036 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.821050 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.821407 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.821474 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.821489 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.821523 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.821604 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822147 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822249 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822359 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822727 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822760 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822804 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822824 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822841 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: 
"49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822893 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822837 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822957 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.824325 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). 
InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.824680 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.825045 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.842632 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.845981 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.858035 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.860679 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.861937 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.861992 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862054 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862085 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862110 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862129 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862150 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862166 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862186 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862204 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node 
\"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862220 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862121 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862235 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862311 4816 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862338 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862358 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862379 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" 
DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862398 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862417 4816 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862437 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862457 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862474 4816 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862493 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862512 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862531 4816 reconciler_common.go:293] "Volume detached for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862573 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862593 4816 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862611 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862636 4816 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862654 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862674 4816 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862692 4816 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" 
DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862710 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862729 4816 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862748 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862801 4816 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862818 4816 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862836 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862897 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862919 
4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862937 4816 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862956 4816 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862977 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862996 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863015 4816 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863035 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863055 4816 reconciler_common.go:293] "Volume detached for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863076 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863097 4816 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863117 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863136 4816 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863155 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863174 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863194 4816 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863212 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863231 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863249 4816 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863267 4816 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863285 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863302 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863319 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 
16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863338 4816 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863356 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863375 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863395 4816 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863414 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863434 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863454 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc 
kubenswrapper[4816]: I0316 00:08:08.863471 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863490 4816 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863507 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863525 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863545 4816 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863585 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863605 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863627 4816 reconciler_common.go:293] "Volume detached for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863646 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863665 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863685 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863705 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863724 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863743 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863762 4816 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863784 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863805 4816 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863824 4816 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863842 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863860 4816 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863877 4816 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863894 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863911 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863930 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863947 4816 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863965 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863983 4816 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864002 4816 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864020 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864037 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864054 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864071 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864089 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864106 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864124 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864143 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 16 
00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864160 4816 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864177 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864195 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864212 4816 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864229 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864246 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864264 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864281 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864298 4816 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864316 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864334 4816 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864352 4816 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864370 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864388 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864404 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864422 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864440 4816 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864458 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864476 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864493 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864509 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864527 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 
00:08:08.864545 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864582 4816 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864602 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864620 4816 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864637 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864655 4816 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864674 4816 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864692 4816 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864713 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864730 4816 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864748 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864767 4816 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864786 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864803 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864820 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864837 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864854 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864872 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864891 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864908 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864925 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864942 4816 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864961 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864977 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864999 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865017 4816 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865034 4816 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865051 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865068 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865086 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865103 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865120 4816 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865138 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865156 4816 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865174 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865191 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" 
DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865207 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865226 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865243 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865260 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865277 4816 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865295 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.927045 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.927090 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 
00:08:08.927123 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.927146 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.927158 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.939275 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.947359 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.953487 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.967199 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 16 00:08:08 crc kubenswrapper[4816]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 16 00:08:08 crc kubenswrapper[4816]: set -o allexport Mar 16 00:08:08 crc kubenswrapper[4816]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 16 00:08:08 crc kubenswrapper[4816]: source /etc/kubernetes/apiserver-url.env Mar 16 00:08:08 crc kubenswrapper[4816]: else Mar 16 00:08:08 crc kubenswrapper[4816]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 16 00:08:08 crc kubenswrapper[4816]: exit 1 Mar 16 00:08:08 crc kubenswrapper[4816]: fi Mar 16 00:08:08 crc kubenswrapper[4816]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 16 00:08:08 crc kubenswrapper[4816]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 16 00:08:08 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.968350 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 16 00:08:08 crc kubenswrapper[4816]: W0316 00:08:08.969012 4816 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-e093c69c398039152d69290c0e1b80d97325393a60e0e47e56ac753af5c622b7 WatchSource:0}: Error finding container e093c69c398039152d69290c0e1b80d97325393a60e0e47e56ac753af5c622b7: Status 404 returned error can't find the container with id e093c69c398039152d69290c0e1b80d97325393a60e0e47e56ac753af5c622b7 Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.975489 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 16 00:08:08 crc kubenswrapper[4816]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 16 00:08:08 crc kubenswrapper[4816]: if [[ -f "/env/_master" ]]; then Mar 16 00:08:08 crc kubenswrapper[4816]: set -o allexport Mar 16 00:08:08 crc kubenswrapper[4816]: source "/env/_master" Mar 16 00:08:08 crc kubenswrapper[4816]: set +o allexport Mar 16 00:08:08 crc kubenswrapper[4816]: fi Mar 16 00:08:08 crc kubenswrapper[4816]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 16 00:08:08 crc kubenswrapper[4816]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 16 00:08:08 crc kubenswrapper[4816]: ho_enable="--enable-hybrid-overlay" Mar 16 00:08:08 crc kubenswrapper[4816]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 16 00:08:08 crc kubenswrapper[4816]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 16 00:08:08 crc kubenswrapper[4816]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 16 00:08:08 crc kubenswrapper[4816]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 16 00:08:08 crc kubenswrapper[4816]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 16 00:08:08 crc kubenswrapper[4816]: --webhook-host=127.0.0.1 \ Mar 16 00:08:08 crc kubenswrapper[4816]: --webhook-port=9743 \ Mar 16 00:08:08 crc kubenswrapper[4816]: ${ho_enable} \ Mar 16 00:08:08 crc kubenswrapper[4816]: --enable-interconnect \ Mar 16 00:08:08 crc kubenswrapper[4816]: --disable-approver \ Mar 16 00:08:08 crc kubenswrapper[4816]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 16 00:08:08 crc kubenswrapper[4816]: --wait-for-kubernetes-api=200s \ Mar 16 00:08:08 crc kubenswrapper[4816]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 16 00:08:08 crc kubenswrapper[4816]: --loglevel="${LOGLEVEL}" Mar 16 00:08:08 crc kubenswrapper[4816]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 16 00:08:08 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.978181 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 16 00:08:08 crc kubenswrapper[4816]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 16 00:08:08 crc 
kubenswrapper[4816]: if [[ -f "/env/_master" ]]; then Mar 16 00:08:08 crc kubenswrapper[4816]: set -o allexport Mar 16 00:08:08 crc kubenswrapper[4816]: source "/env/_master" Mar 16 00:08:08 crc kubenswrapper[4816]: set +o allexport Mar 16 00:08:08 crc kubenswrapper[4816]: fi Mar 16 00:08:08 crc kubenswrapper[4816]: Mar 16 00:08:08 crc kubenswrapper[4816]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 16 00:08:08 crc kubenswrapper[4816]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 16 00:08:08 crc kubenswrapper[4816]: --disable-webhook \ Mar 16 00:08:08 crc kubenswrapper[4816]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 16 00:08:08 crc kubenswrapper[4816]: --loglevel="${LOGLEVEL}" Mar 16 00:08:08 crc kubenswrapper[4816]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 16 00:08:08 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 16 00:08:08 crc kubenswrapper[4816]: W0316 00:08:08.978328 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-af5042ac8c384074c42d42cc06323d63f664527343509a45a3573a02ad94e5b4 WatchSource:0}: Error finding container af5042ac8c384074c42d42cc06323d63f664527343509a45a3573a02ad94e5b4: Status 404 returned error can't find the container with id 
af5042ac8c384074c42d42cc06323d63f664527343509a45a3573a02ad94e5b4 Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.979452 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.988632 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.990024 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.028277 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7a475f9be4daee3b9f0f7fd48f4450e70f874432d3f4fee48e7d286c66e3de56"} Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.030190 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"af5042ac8c384074c42d42cc06323d63f664527343509a45a3573a02ad94e5b4"} Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.030365 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.030405 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.030433 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.030461 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.030480 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.031625 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e093c69c398039152d69290c0e1b80d97325393a60e0e47e56ac753af5c622b7"} Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.036020 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.036698 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 16 00:08:09 crc kubenswrapper[4816]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 16 00:08:09 crc kubenswrapper[4816]: set -o allexport Mar 16 00:08:09 crc kubenswrapper[4816]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 16 00:08:09 crc kubenswrapper[4816]: source /etc/kubernetes/apiserver-url.env Mar 16 00:08:09 crc 
kubenswrapper[4816]: else Mar 16 00:08:09 crc kubenswrapper[4816]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 16 00:08:09 crc kubenswrapper[4816]: exit 1 Mar 16 00:08:09 crc kubenswrapper[4816]: fi Mar 16 00:08:09 crc kubenswrapper[4816]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 16 00:08:09 crc kubenswrapper[4816]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,V
alue:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f
5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 16 00:08:09 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.037196 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, 
cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.037852 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.040903 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 16 00:08:09 crc kubenswrapper[4816]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 16 00:08:09 crc kubenswrapper[4816]: if [[ -f "/env/_master" ]]; then Mar 16 00:08:09 crc kubenswrapper[4816]: set -o allexport Mar 16 00:08:09 crc kubenswrapper[4816]: source "/env/_master" Mar 16 00:08:09 crc kubenswrapper[4816]: set +o allexport Mar 16 00:08:09 crc kubenswrapper[4816]: fi Mar 16 00:08:09 crc kubenswrapper[4816]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 16 00:08:09 crc kubenswrapper[4816]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 16 00:08:09 crc kubenswrapper[4816]: ho_enable="--enable-hybrid-overlay" Mar 16 00:08:09 crc kubenswrapper[4816]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 16 00:08:09 crc kubenswrapper[4816]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 16 00:08:09 crc kubenswrapper[4816]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 16 00:08:09 crc kubenswrapper[4816]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 16 00:08:09 crc kubenswrapper[4816]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 16 00:08:09 crc kubenswrapper[4816]: --webhook-host=127.0.0.1 \ Mar 16 00:08:09 crc kubenswrapper[4816]: --webhook-port=9743 \ Mar 16 00:08:09 crc kubenswrapper[4816]: ${ho_enable} \ Mar 16 00:08:09 crc kubenswrapper[4816]: --enable-interconnect \ Mar 16 00:08:09 crc kubenswrapper[4816]: --disable-approver \ Mar 16 00:08:09 crc kubenswrapper[4816]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 16 00:08:09 crc kubenswrapper[4816]: --wait-for-kubernetes-api=200s \ Mar 16 00:08:09 crc kubenswrapper[4816]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 16 00:08:09 crc kubenswrapper[4816]: --loglevel="${LOGLEVEL}" Mar 16 00:08:09 crc kubenswrapper[4816]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 16 00:08:09 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.043714 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 16 00:08:09 crc kubenswrapper[4816]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 16 00:08:09 crc 
kubenswrapper[4816]: if [[ -f "/env/_master" ]]; then Mar 16 00:08:09 crc kubenswrapper[4816]: set -o allexport Mar 16 00:08:09 crc kubenswrapper[4816]: source "/env/_master" Mar 16 00:08:09 crc kubenswrapper[4816]: set +o allexport Mar 16 00:08:09 crc kubenswrapper[4816]: fi Mar 16 00:08:09 crc kubenswrapper[4816]: Mar 16 00:08:09 crc kubenswrapper[4816]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 16 00:08:09 crc kubenswrapper[4816]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 16 00:08:09 crc kubenswrapper[4816]: --disable-webhook \ Mar 16 00:08:09 crc kubenswrapper[4816]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 16 00:08:09 crc kubenswrapper[4816]: --loglevel="${LOGLEVEL}" Mar 16 00:08:09 crc kubenswrapper[4816]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 16 00:08:09 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.044950 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.045615 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.054821 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.067077 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.078521 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.087944 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.098767 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.111413 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.122580 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.133173 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.133974 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.134027 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.134046 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.134071 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.134094 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.145919 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.160188 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.171203 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.237523 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.237595 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.237608 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.237631 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.237661 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.269210 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.269389 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:10.269365579 +0000 UTC m=+83.365665532 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.339821 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.339864 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.339874 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.339892 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.339903 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.370595 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.370659 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.370695 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.370718 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370793 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370861 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370876 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370882 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370889 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:10.370867752 +0000 UTC m=+83.467167775 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370895 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370891 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370983 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:10.370963285 +0000 UTC m=+83.467263268 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370902 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.371068 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:10.371053368 +0000 UTC m=+83.467353361 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370909 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.371121 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:10.37110923 +0000 UTC m=+83.467409223 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.443328 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.443370 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.443382 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.443399 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.443415 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.546235 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.546296 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.546317 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.546339 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.546353 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.649537 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.650004 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.650024 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.650053 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.650075 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.671350 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.671874 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.673124 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.673768 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.674686 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.675141 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.675682 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.676526 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.677138 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.678013 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.678494 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.679512 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.679989 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.680459 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.681278 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.681789 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.682749 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.683101 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.683742 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.684645 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.685102 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.686114 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.686535 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.687537 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.688133 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.688778 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.689840 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.690279 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.691153 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.691596 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.692368 4816 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.692465 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.693959 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.694823 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.695297 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.696785 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.697373 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.698192 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.698863 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.699893 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.700322 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.701226 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.701801 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.702726 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.703176 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.704127 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.704625 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.705657 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.706112 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.706906 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.707356 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.708198 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.708749 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.709188 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.752198 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.752245 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc 
kubenswrapper[4816]: I0316 00:08:09.752256 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.752274 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.752285 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.854425 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.854491 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.854512 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.854542 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.854597 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.957231 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.957297 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.957319 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.957352 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.957373 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.060055 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.060141 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.060169 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.060218 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.060242 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.163048 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.163124 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.163151 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.163180 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.163203 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.266367 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.266432 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.266451 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.266477 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.266495 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.279774 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.279937 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:08:12.279914117 +0000 UTC m=+85.376214070 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.368897 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.368962 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.368979 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.369007 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.369026 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.380739 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.380808 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.380859 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.380875 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.380949 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.380976 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:12.380950314 +0000 UTC m=+85.477250297 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381062 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381152 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381197 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381224 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381200 4816 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381167 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:12.381146421 +0000 UTC m=+85.477446414 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381314 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:12.381299006 +0000 UTC m=+85.477598999 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381278 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381349 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381418 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:12.381394409 +0000 UTC m=+85.477694402 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.471437 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.471528 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.471569 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.471595 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.471615 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.573905 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.573982 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.573991 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.574005 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.574028 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.666995 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.667057 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.667013 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.667314 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.667719 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.667600 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.676587 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.676628 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.676638 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.676653 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.676663 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.780093 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.780162 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.780180 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.780207 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.780225 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.882181 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.882218 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.882228 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.882247 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.882259 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.992918 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.992990 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.993008 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.993032 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.993057 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.096147 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.096227 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.096248 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.096272 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.096290 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.199590 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.199653 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.199675 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.199705 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.199727 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.302950 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.303016 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.303039 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.303072 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.303092 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.405991 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.406044 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.406062 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.406085 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.406104 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.509047 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.509112 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.509134 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.509163 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.509185 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.611845 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.611894 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.611910 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.611935 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.611951 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.714581 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.714669 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.714687 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.714713 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.714729 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.817636 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.818071 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.818258 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.818459 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.818669 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.921966 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.922040 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.922060 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.922117 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.922140 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.025212 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.025267 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.025309 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.025335 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.025352 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.128017 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.128086 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.128107 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.128132 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.128152 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.230146 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.230199 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.230215 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.230243 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.230260 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.298971 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.299333 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:08:16.299131131 +0000 UTC m=+89.395431084 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.332817 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.332848 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.332860 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.332877 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.332889 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.400261 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.400349 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.400391 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.400427 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400479 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400533 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400584 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:16.400562901 +0000 UTC m=+89.496862854 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400608 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400640 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:16.400618373 +0000 UTC m=+89.496918416 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400651 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400658 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400696 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400719 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400671 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400774 4816 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:16.400756348 +0000 UTC m=+89.497056351 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400889 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:16.400846651 +0000 UTC m=+89.497146634 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.436075 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.436139 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.436157 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.436181 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.436199 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.538965 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.539027 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.539045 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.539072 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.539090 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.642575 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.642616 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.642627 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.642645 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.642658 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.667303 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.667338 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.667455 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.667312 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.667661 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.667750 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.745516 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.745578 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.745589 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.745607 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.745618 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.848772 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.848817 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.848827 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.848845 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.848857 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.951492 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.951542 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.951565 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.951582 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.951596 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.054083 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.054130 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.054141 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.054158 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.054171 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.156845 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.157132 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.157152 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.157175 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.157190 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.260602 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.260680 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.260734 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.260768 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.260789 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.363744 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.363811 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.363834 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.363869 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.363907 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.466741 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.466821 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.466843 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.466874 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.466894 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.569942 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.570008 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.570027 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.570054 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.570071 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.673348 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.673414 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.673440 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.673471 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.673498 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.776694 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.776750 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.776761 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.776781 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.776792 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.879838 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.879917 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.879931 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.879955 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.879969 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.982053 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.982097 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.982106 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.982125 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.982136 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.085034 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.085592 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.085677 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.085746 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.085807 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.189749 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.190245 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.190326 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.190405 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.190484 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.293650 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.293718 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.293746 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.293776 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.293800 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.396594 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.396648 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.396695 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.396720 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.396738 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.499531 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.499628 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.499653 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.499683 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.499704 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.603091 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.603145 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.603161 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.603184 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.603202 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.667411 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.667455 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.667468 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:08:14 crc kubenswrapper[4816]: E0316 00:08:14.667634 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:08:14 crc kubenswrapper[4816]: E0316 00:08:14.667760 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:08:14 crc kubenswrapper[4816]: E0316 00:08:14.667881 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.705672 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.705737 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.705758 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.705789 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.705810 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.808950 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.809077 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.809097 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.809123 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.809141 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.913042 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.913121 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.913143 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.913174 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.913195 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.016204 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.016331 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.016366 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.016399 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.016423 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.119508 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.119635 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.119661 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.119696 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.119720 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.222819 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.222905 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.222922 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.222949 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.222967 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.326096 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.326153 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.326170 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.326192 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.326208 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.429436 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.429517 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.429532 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.429575 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.429589 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.532781 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.532840 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.532863 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.532888 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.532907 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.636013 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.636081 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.636099 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.636129 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.636148 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.738368 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.738437 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.738455 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.738482 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.738502 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.841347 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.841416 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.841429 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.841452 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.841468 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.944143 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.944200 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.944209 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.944229 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.944241 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.047205 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.047273 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.047289 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.047314 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.047337 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.150055 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.150124 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.150141 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.150166 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.150184 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.253002 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.253069 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.253080 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.253096 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.253140 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.338655 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.338913 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:08:24.338886913 +0000 UTC m=+97.435186906 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.355940 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.356005 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.356027 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.356066 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.356115 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.439644 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.439715 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.439759 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.439800 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.439966 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440023 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440027 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440172 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.440137277 +0000 UTC m=+97.536437270 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440047 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.439991 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440273 4816 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.440250771 +0000 UTC m=+97.536550764 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440361 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.440332154 +0000 UTC m=+97.536632147 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440052 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440405 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440429 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440488 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.440474189 +0000 UTC m=+97.536774172 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.458698 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.458784 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.458808 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.458840 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.458863 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.562443 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.562534 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.562573 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.562608 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.562639 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.665431 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.665493 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.665512 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.665539 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.665593 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.667073 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.667158 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.667323 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.667429 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.667662 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.667787 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.768600 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.768671 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.768692 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.768737 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.768765 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.872192 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.872251 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.872263 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.872281 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.872293 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.976100 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.976159 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.976172 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.976194 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.976210 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.997613 4816 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.079836 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.079906 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.079923 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.079949 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.079968 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.182598 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.182672 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.182689 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.182715 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.182734 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.285765 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.285836 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.285847 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.285868 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.285883 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.389136 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.389265 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.389288 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.389312 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.389326 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.491810 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.491867 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.491883 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.491904 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.491916 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.594216 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.594267 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.594287 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.594312 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.594334 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.686014 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.697434 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.697509 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.697535 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.697597 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.697618 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.706951 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.722018 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.732119 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.741465 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.749357 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.799763 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.799818 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.799842 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.799866 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.799884 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.906994 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.907617 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.907680 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.907718 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.907742 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.010606 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.010690 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.010710 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.010736 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.010753 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.112895 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.112945 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.112956 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.112974 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.112986 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.215991 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.216050 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.216062 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.216077 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.216085 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.318407 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.318460 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.318477 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.318500 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.318519 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.374455 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.374502 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.374514 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.374534 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.374574 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: E0316 00:08:18.389134 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.394855 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.394914 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.394934 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.394960 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.394977 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: E0316 00:08:18.410253 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.417203 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.417250 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.417261 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.417279 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.417290 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: E0316 00:08:18.433076 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.437762 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.437796 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.437808 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.437824 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.437836 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: E0316 00:08:18.453051 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.457133 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.457226 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.457241 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.457285 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.457302 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: E0316 00:08:18.469394 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:18 crc kubenswrapper[4816]: E0316 00:08:18.470209 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.472587 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.472635 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.472655 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.472682 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.472701 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.576013 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.576061 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.576074 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.576094 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.576107 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.667432 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.667518 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.667452 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:18 crc kubenswrapper[4816]: E0316 00:08:18.667633 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:18 crc kubenswrapper[4816]: E0316 00:08:18.667745 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:18 crc kubenswrapper[4816]: E0316 00:08:18.667863 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.678840 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.678904 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.678931 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.678959 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.678980 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.781600 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.781642 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.781653 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.781670 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.781681 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.884247 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.884313 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.884337 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.884372 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.884396 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.986842 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.986914 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.986932 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.986955 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.986973 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.089646 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.089689 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.089699 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.089715 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.089725 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.192859 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.192924 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.192938 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.192955 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.192964 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.295910 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.295959 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.295971 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.295988 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.296000 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.398661 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.398725 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.398747 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.398776 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.398796 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.501172 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.501250 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.501292 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.501323 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.501345 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.604411 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.604479 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.604500 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.604529 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.604590 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.706982 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.707031 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.707043 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.707061 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.707074 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.809338 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.809383 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.809396 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.809414 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.809426 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.911589 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.911747 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.911761 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.911784 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.911797 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.015072 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.015142 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.015173 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.015207 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.015228 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.118146 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.118207 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.118225 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.118381 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.118440 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.221316 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.221364 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.221376 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.221395 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.221409 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.324306 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.324358 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.324376 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.324399 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.324415 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.427058 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.427101 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.427112 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.427128 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.427140 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.529501 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.529575 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.529589 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.529610 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.529623 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.632395 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.632441 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.632458 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.632488 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.632513 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.667398 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.667500 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.667605 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:20 crc kubenswrapper[4816]: E0316 00:08:20.667517 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:20 crc kubenswrapper[4816]: E0316 00:08:20.667718 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:20 crc kubenswrapper[4816]: E0316 00:08:20.668039 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.685515 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.685805 4816 scope.go:117] "RemoveContainer" containerID="29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f" Mar 16 00:08:20 crc kubenswrapper[4816]: E0316 00:08:20.685950 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.734285 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.734324 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.734337 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.734355 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.734367 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.838083 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.838135 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.838146 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.838169 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.838182 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.940758 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.940825 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.940887 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.940914 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.940937 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.043468 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.043503 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.043512 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.043527 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.043537 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.067916 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.068263 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.068564 4816 scope.go:117] "RemoveContainer" containerID="29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f" Mar 16 00:08:21 crc kubenswrapper[4816]: E0316 00:08:21.068754 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.084748 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.099231 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.114758 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.131770 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.144366 4816 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.146402 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.146441 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.146449 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.146465 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.146474 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.158745 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.173578 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.248935 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.249027 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.249046 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.249072 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.249093 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.353057 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.353115 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.353131 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.353155 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.353172 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.366881 4816 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.456127 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.456204 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.456226 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.456255 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.456274 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.558812 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.558865 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.558878 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.558901 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.558914 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.662158 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.662339 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.662378 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.662413 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.662438 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.697061 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.765132 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.765174 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.765186 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.765206 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.765218 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.868657 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.868726 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.868739 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.868757 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.868769 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.971652 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.971700 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.971714 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.971734 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.971747 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.071979 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b"} Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.075805 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.075854 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.075868 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.075886 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.075899 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.101214 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.117371 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.132220 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.151701 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.172926 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.178438 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.178492 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.178503 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.178526 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.178538 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.194018 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.212252 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.227269 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.281820 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.281874 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.281885 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.281905 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.281919 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.384777 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.384814 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.384823 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.384838 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.384848 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.487596 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.487738 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.487754 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.487774 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.487788 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.591370 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.591423 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.591435 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.591454 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.591466 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.666773 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.666776 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:08:22 crc kubenswrapper[4816]: E0316 00:08:22.666948 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.667091 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:22 crc kubenswrapper[4816]: E0316 00:08:22.667157 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:08:22 crc kubenswrapper[4816]: E0316 00:08:22.667345 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.693329 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.693379 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.693391 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.693410 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.693423 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.796939 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.796991 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.797006 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.797036 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.797055 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.900765 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.900826 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.900844 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.900874 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.900893 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.003441 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.003484 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.003495 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.003513 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.003525 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.106199 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.106259 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.106272 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.106293 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.106306 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.208958 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.209013 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.209028 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.209053 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.209076 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.311674 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.311725 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.311734 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.311760 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.311774 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.413884 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.413943 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.413954 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.413975 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.414000 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.516519 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.516602 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.516621 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.516647 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.516663 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.619612 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.619679 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.619700 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.619729 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.619750 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.722542 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.723176 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.723198 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.723224 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.723244 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.826383 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.826435 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.826447 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.826466 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.826480 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.929839 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.929886 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.929899 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.929918 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.929930 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.038245 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.038297 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.038310 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.038328 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.038341 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.140940 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.140988 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.141000 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.141022 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.141035 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.244263 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.244297 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.244309 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.244328 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.244341 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.346601 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.346631 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.346640 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.346654 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.346664 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.410762 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.411000 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:40.410975108 +0000 UTC m=+113.507275101 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.449603 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.449650 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.449660 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.449679 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.449690 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.512410 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.512482 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.512528 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.512607 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.512767 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.512774 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.512845 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:40.512822182 +0000 UTC m=+113.609122175 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.512854 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.512881 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.512988 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:40.512950886 +0000 UTC m=+113.609250989 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.513082 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.513125 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:40.513112881 +0000 UTC m=+113.609413074 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.513213 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.513237 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.513252 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.513291 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:40.513278917 +0000 UTC m=+113.609578880 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.552349 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.552402 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.552415 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.552435 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.552448 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.656671 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.656736 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.656757 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.656784 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.656805 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.667676 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.667676 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.667876 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.667711 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.668028 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.668135 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.759774 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.759840 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.759860 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.759885 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.759903 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.862734 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.862776 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.862784 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.862802 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.862812 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.966104 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.966164 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.966182 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.966209 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.966231 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.068910 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.068976 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.068992 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.069021 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.069038 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.172422 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.172481 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.172496 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.172515 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.172528 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.276398 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.276454 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.276468 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.276492 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.276507 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.379179 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.379264 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.379285 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.379317 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.379341 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.482932 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.482992 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.483048 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.483075 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.483093 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.586367 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.586423 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.586441 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.586466 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.586482 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.688836 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.688872 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.688882 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.688899 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.688910 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.791660 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.791720 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.791734 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.791754 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.792114 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.894951 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.895043 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.895060 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.895081 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.895094 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.998393 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.998466 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.998483 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.998508 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.998527 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.101050 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.101116 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.101127 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.101145 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.101161 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.203992 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.204036 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.204047 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.204068 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.204080 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.307148 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.307207 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.307224 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.307253 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.307270 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.410665 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.410717 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.410728 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.410745 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.410757 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.515003 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.515058 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.515075 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.515096 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.515114 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.618131 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.618185 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.618198 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.618219 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.618233 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.667493 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.667614 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.667614 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:26 crc kubenswrapper[4816]: E0316 00:08:26.667782 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:26 crc kubenswrapper[4816]: E0316 00:08:26.667962 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:26 crc kubenswrapper[4816]: E0316 00:08:26.668173 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.721882 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.721962 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.721985 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.722018 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.722043 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.825806 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.826204 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.826298 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.826442 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.826536 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.930584 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.930653 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.930666 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.930694 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.930708 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.034542 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.034616 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.034633 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.034659 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.034680 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.093765 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.111959 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.132671 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.138119 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.138204 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.138224 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc 
kubenswrapper[4816]: I0316 00:08:27.138289 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.138307 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.150406 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.164657 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.179901 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.194117 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.219382 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.236858 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.241082 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.241141 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.241156 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.241180 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.241199 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.344893 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.344976 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.344991 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.345018 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.345034 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.448582 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.448638 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.448653 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.448675 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.448688 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.553614 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.553667 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.553685 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.553715 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.553734 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.656678 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.656748 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.656765 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.656794 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.656811 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.693732 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.716456 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.738613 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.759801 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.759865 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.759875 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 
00:08:27.759895 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.759907 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.761863 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.801421 4816 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f
1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.826244 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.846564 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.862703 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.862829 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.862959 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.862999 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.863019 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.869728 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.966736 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.966817 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.966834 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.966865 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.966885 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.070448 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.070514 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.070537 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.070613 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.070643 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.173708 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.173834 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.173867 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.173902 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.173926 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.276895 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.276985 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.277013 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.277049 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.277076 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.380727 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.380793 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.380811 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.380836 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.380854 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.483450 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.483507 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.483525 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.483577 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.483597 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.586775 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.586825 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.586842 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.586864 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.586881 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.634776 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.634836 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.634854 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.634875 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.634890 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: E0316 00:08:28.656445 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.661913 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.661990 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.662016 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.662052 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.662075 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.667507 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.667602 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.667578 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:28 crc kubenswrapper[4816]: E0316 00:08:28.667792 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:28 crc kubenswrapper[4816]: E0316 00:08:28.667885 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:28 crc kubenswrapper[4816]: E0316 00:08:28.668102 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:28 crc kubenswrapper[4816]: E0316 00:08:28.694771 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.700068 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.700148 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.700161 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.700188 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.700202 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: E0316 00:08:28.722148 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.727698 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.727755 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.727774 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.727800 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.727818 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: E0316 00:08:28.752034 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.757608 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.757672 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.757690 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.757714 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.757732 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: E0316 00:08:28.778512 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:28 crc kubenswrapper[4816]: E0316 00:08:28.778832 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.781203 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.781280 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.781296 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.781322 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.781339 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.885169 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.885237 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.885254 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.885278 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.885295 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.989101 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.989171 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.989182 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.989203 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.989215 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.093042 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.093115 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.093136 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.093165 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.093186 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.195716 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.195821 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.195841 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.195867 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.195886 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.298792 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.298852 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.298872 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.298896 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.298913 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.401586 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.401629 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.401638 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.401655 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.401664 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.504142 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.504187 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.504202 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.504224 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.504238 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.607737 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.607818 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.607848 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.607881 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.607907 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.711228 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.711280 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.711297 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.711321 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.711340 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.814639 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.814708 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.814734 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.814764 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.814786 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.917384 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.917454 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.917477 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.917508 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.917532 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.020374 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.020783 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.020918 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.021055 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.021180 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.124926 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.124966 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.124979 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.125026 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.125042 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.228071 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.228153 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.228174 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.228203 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.228221 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.331198 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.331263 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.331282 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.331309 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.331328 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.434801 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.434916 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.434939 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.434968 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.434986 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.538309 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.538383 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.538401 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.538427 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.538446 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.641782 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.641859 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.641884 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.641915 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.641938 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.666713 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.666757 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.666757 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:30 crc kubenswrapper[4816]: E0316 00:08:30.667402 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:30 crc kubenswrapper[4816]: E0316 00:08:30.667611 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:30 crc kubenswrapper[4816]: E0316 00:08:30.667895 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.744951 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.745014 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.745032 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.745057 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.745079 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.848506 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.848584 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.848597 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.848622 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.848639 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.951911 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.951962 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.951978 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.952003 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.952023 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.055605 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.055675 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.055693 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.055718 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.055736 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.080011 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-cnhkf"] Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.080490 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-cnhkf" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.083131 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.083192 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.083355 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.106167 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.127612 4816 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.146446 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.158812 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.158876 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.158900 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.158930 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.158951 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.164953 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.179380 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3e686cd4-bddf-463e-b471-e49ea862691e-hosts-file\") pod \"node-resolver-cnhkf\" (UID: \"3e686cd4-bddf-463e-b471-e49ea862691e\") " pod="openshift-dns/node-resolver-cnhkf" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.179479 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bwzt\" (UniqueName: \"kubernetes.io/projected/3e686cd4-bddf-463e-b471-e49ea862691e-kube-api-access-9bwzt\") pod \"node-resolver-cnhkf\" (UID: \"3e686cd4-bddf-463e-b471-e49ea862691e\") " pod="openshift-dns/node-resolver-cnhkf" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.192293 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.220300 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.236453 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.251117 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.261749 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.261817 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.261840 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.261864 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.261880 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.267508 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.280977 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3e686cd4-bddf-463e-b471-e49ea862691e-hosts-file\") pod \"node-resolver-cnhkf\" (UID: \"3e686cd4-bddf-463e-b471-e49ea862691e\") " pod="openshift-dns/node-resolver-cnhkf" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.281216 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bwzt\" (UniqueName: \"kubernetes.io/projected/3e686cd4-bddf-463e-b471-e49ea862691e-kube-api-access-9bwzt\") pod \"node-resolver-cnhkf\" (UID: \"3e686cd4-bddf-463e-b471-e49ea862691e\") " pod="openshift-dns/node-resolver-cnhkf" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.281219 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/3e686cd4-bddf-463e-b471-e49ea862691e-hosts-file\") pod \"node-resolver-cnhkf\" (UID: \"3e686cd4-bddf-463e-b471-e49ea862691e\") " pod="openshift-dns/node-resolver-cnhkf" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.310246 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bwzt\" (UniqueName: \"kubernetes.io/projected/3e686cd4-bddf-463e-b471-e49ea862691e-kube-api-access-9bwzt\") pod \"node-resolver-cnhkf\" (UID: \"3e686cd4-bddf-463e-b471-e49ea862691e\") " pod="openshift-dns/node-resolver-cnhkf" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.364890 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.364936 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.364950 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.364970 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.364984 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.402404 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-cnhkf" Mar 16 00:08:31 crc kubenswrapper[4816]: W0316 00:08:31.416987 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e686cd4_bddf_463e_b471_e49ea862691e.slice/crio-ee4ee15b9147a70ac3d21b58b9d3cb23b0646c2fdf284bbf82e86743ee26a221 WatchSource:0}: Error finding container ee4ee15b9147a70ac3d21b58b9d3cb23b0646c2fdf284bbf82e86743ee26a221: Status 404 returned error can't find the container with id ee4ee15b9147a70ac3d21b58b9d3cb23b0646c2fdf284bbf82e86743ee26a221 Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.467532 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-mt7bq"] Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.468358 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-jrdcz"] Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.468536 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-szscw"] Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.468799 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.469187 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.469861 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.470587 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.470627 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.470641 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.470659 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.470672 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.474472 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.474925 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.475120 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.475117 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.475304 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.475238 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.475583 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.475643 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.475545 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.475796 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.508156 4816 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.508482 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.543207 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.567215 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.574031 4816 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.574077 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.574089 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.574110 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.574123 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.578785 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592454 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-var-lib-cni-bin\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592508 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-cnibin\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592545 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592609 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-run-netns\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592643 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trwbh\" (UniqueName: \"kubernetes.io/projected/dd08ece2-7636-4966-973a-e96a34b70b53-kube-api-access-trwbh\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592743 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-multus-socket-dir-parent\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592834 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-var-lib-cni-multus\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592863 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-var-lib-kubelet\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592890 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-etc-kubernetes\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592924 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd08ece2-7636-4966-973a-e96a34b70b53-mcd-auth-proxy-config\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593009 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfvdb\" (UniqueName: \"kubernetes.io/projected/03ef49f1-0c6a-443a-8df3-2db339c562ed-kube-api-access-xfvdb\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc 
kubenswrapper[4816]: I0316 00:08:31.593112 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-multus-conf-dir\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593156 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e9789e58-12c8-4831-9401-af48a3e92209-multus-daemon-config\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593197 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-os-release\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593279 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dd08ece2-7636-4966-973a-e96a34b70b53-rootfs\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593295 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-hostroot\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc 
kubenswrapper[4816]: I0316 00:08:31.593311 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/03ef49f1-0c6a-443a-8df3-2db339c562ed-cni-binary-copy\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593360 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/03ef49f1-0c6a-443a-8df3-2db339c562ed-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593389 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd08ece2-7636-4966-973a-e96a34b70b53-proxy-tls\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593434 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxf6x\" (UniqueName: \"kubernetes.io/projected/e9789e58-12c8-4831-9401-af48a3e92209-kube-api-access-mxf6x\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593456 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-system-cni-dir\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: 
\"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593480 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-system-cni-dir\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593531 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-multus-cni-dir\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593588 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-os-release\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593615 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-run-k8s-cni-cncf-io\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593676 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-run-multus-certs\") pod \"multus-szscw\" (UID: 
\"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593745 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-cnibin\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593813 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e9789e58-12c8-4831-9401-af48a3e92209-cni-binary-copy\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.594630 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.610626 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.624054 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.639695 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.655229 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.671398 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.676816 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.676844 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.676855 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.676869 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.676878 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695000 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/03ef49f1-0c6a-443a-8df3-2db339c562ed-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695064 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd08ece2-7636-4966-973a-e96a34b70b53-proxy-tls\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695094 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxf6x\" (UniqueName: \"kubernetes.io/projected/e9789e58-12c8-4831-9401-af48a3e92209-kube-api-access-mxf6x\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695115 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-system-cni-dir\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695136 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-multus-cni-dir\") pod \"multus-szscw\" (UID: 
\"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695155 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-system-cni-dir\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695174 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-os-release\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695193 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-run-k8s-cni-cncf-io\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695210 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-run-multus-certs\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695235 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-cnibin\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: 
I0316 00:08:31.695252 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e9789e58-12c8-4831-9401-af48a3e92209-cni-binary-copy\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695272 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695290 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-run-netns\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695314 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-var-lib-cni-bin\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695341 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-cnibin\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695364 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-trwbh\" (UniqueName: \"kubernetes.io/projected/dd08ece2-7636-4966-973a-e96a34b70b53-kube-api-access-trwbh\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695388 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-var-lib-kubelet\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695410 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-etc-kubernetes\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695439 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd08ece2-7636-4966-973a-e96a34b70b53-mcd-auth-proxy-config\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695462 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-multus-socket-dir-parent\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695487 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-var-lib-cni-multus\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695520 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfvdb\" (UniqueName: \"kubernetes.io/projected/03ef49f1-0c6a-443a-8df3-2db339c562ed-kube-api-access-xfvdb\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695577 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e9789e58-12c8-4831-9401-af48a3e92209-multus-daemon-config\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695573 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-run-netns\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695611 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-run-k8s-cni-cncf-io\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695676 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-run-multus-certs\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695602 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-os-release\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695704 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-os-release\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695804 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-multus-conf-dir\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695815 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-cnibin\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695831 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dd08ece2-7636-4966-973a-e96a34b70b53-rootfs\") pod \"machine-config-daemon-jrdcz\" (UID: 
\"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695856 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-hostroot\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695878 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/03ef49f1-0c6a-443a-8df3-2db339c562ed-cni-binary-copy\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696325 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-var-lib-cni-bin\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696375 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-cnibin\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696717 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-var-lib-cni-multus\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " 
pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696765 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-var-lib-kubelet\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696796 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-etc-kubernetes\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696789 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/03ef49f1-0c6a-443a-8df3-2db339c562ed-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696781 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-system-cni-dir\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696882 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dd08ece2-7636-4966-973a-e96a34b70b53-rootfs\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696928 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-hostroot\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696941 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-system-cni-dir\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.697030 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-multus-conf-dir\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.697067 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-os-release\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.697132 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-multus-cni-dir\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.697249 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-multus-socket-dir-parent\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.697285 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/03ef49f1-0c6a-443a-8df3-2db339c562ed-cni-binary-copy\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.697623 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e9789e58-12c8-4831-9401-af48a3e92209-cni-binary-copy\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.697623 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd08ece2-7636-4966-973a-e96a34b70b53-mcd-auth-proxy-config\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.697728 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.698002 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/e9789e58-12c8-4831-9401-af48a3e92209-multus-daemon-config\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.700532 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd08ece2-7636-4966-973a-e96a34b70b53-proxy-tls\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.702009 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9
be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b5
4b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.712643 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trwbh\" (UniqueName: \"kubernetes.io/projected/dd08ece2-7636-4966-973a-e96a34b70b53-kube-api-access-trwbh\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.713869 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.718358 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfvdb\" (UniqueName: \"kubernetes.io/projected/03ef49f1-0c6a-443a-8df3-2db339c562ed-kube-api-access-xfvdb\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.719743 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxf6x\" (UniqueName: \"kubernetes.io/projected/e9789e58-12c8-4831-9401-af48a3e92209-kube-api-access-mxf6x\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.728430 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.738907 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.751441 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.761965 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.774529 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.778742 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.778778 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.778792 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.778813 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.778826 4816 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.788177 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.804174 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.818896 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.818897 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: W0316 00:08:31.831139 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9789e58_12c8_4831_9401_af48a3e92209.slice/crio-d14e6099281a3f4a5c54bb47b271be878d39affb726d275c2082db2728e837cc WatchSource:0}: Error finding container d14e6099281a3f4a5c54bb47b271be878d39affb726d275c2082db2728e837cc: Status 404 returned error can't find the container with id 
d14e6099281a3f4a5c54bb47b271be878d39affb726d275c2082db2728e837cc Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.835627 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.853318 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.854419 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.862089 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: W0316 00:08:31.869368 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03ef49f1_0c6a_443a_8df3_2db339c562ed.slice/crio-2322d7ca26005146ca00985bb21a70c27a081273e8ba13a44e439ebce554fff7 WatchSource:0}: Error finding container 2322d7ca26005146ca00985bb21a70c27a081273e8ba13a44e439ebce554fff7: Status 404 returned error can't find the container with id 2322d7ca26005146ca00985bb21a70c27a081273e8ba13a44e439ebce554fff7 Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.877238 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.880931 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.880961 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.880975 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.880993 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.881006 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.890131 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-psjs7"] Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.891347 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.894808 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.894964 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.895033 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.895466 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.895745 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.895764 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.895983 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 16 00:08:31 crc kubenswrapper[4816]: W0316 00:08:31.897885 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd08ece2_7636_4966_973a_e96a34b70b53.slice/crio-2f008ed10596c37892fa68d6a991ef4c4c25f62429883230c3721018781ad8a8 WatchSource:0}: Error 
finding container 2f008ed10596c37892fa68d6a991ef4c4c25f62429883230c3721018781ad8a8: Status 404 returned error can't find the container with id 2f008ed10596c37892fa68d6a991ef4c4c25f62429883230c3721018781ad8a8 Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.902681 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.919943 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.944467 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.965317 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.984298 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.984339 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.984348 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.984366 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.984378 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.995408 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.998743 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-var-lib-openvswitch\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.998807 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.998836 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-systemd-units\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.998865 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-ovn\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.998886 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-node-log\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.998910 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-netd\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.998934 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-config\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.998996 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-bin\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.999040 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-ovn-kubernetes\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 
00:08:31.999060 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-env-overrides\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.999081 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-kubelet\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.999109 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-openvswitch\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.999133 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-systemd\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.999167 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-netns\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: 
I0316 00:08:31.999196 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovn-node-metrics-cert\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.999308 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-script-lib\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.999439 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-etc-openvswitch\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:31.999542 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-slash\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:31.999582 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-log-socket\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc 
kubenswrapper[4816]: I0316 00:08:31.999599 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rd68\" (UniqueName: \"kubernetes.io/projected/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-kube-api-access-9rd68\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.018269 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4
f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.035635 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.051282 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.065290 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.080464 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.092193 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.092247 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.092265 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.092293 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.092313 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.093906 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100356 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100438 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100473 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-systemd-units\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100495 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-ovn\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100508 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-node-log\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100542 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-systemd-units\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100586 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-ovn\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100610 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-config\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100638 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-bin\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100673 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-node-log\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101318 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-bin\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101430 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-config\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101500 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-netd\") pod \"ovnkube-node-psjs7\" (UID: 
\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101531 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-ovn-kubernetes\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101610 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-kubelet\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101665 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-kubelet\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101656 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-netd\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101692 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-env-overrides\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101745 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-ovn-kubernetes\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101779 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-openvswitch\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101937 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-systemd\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101800 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-openvswitch\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101984 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovn-node-metrics-cert\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 
crc kubenswrapper[4816]: I0316 00:08:32.102032 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-systemd\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102057 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-script-lib\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102090 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-netns\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102110 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-slash\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102132 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-etc-openvswitch\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102160 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-log-socket\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102176 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-netns\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102181 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rd68\" (UniqueName: \"kubernetes.io/projected/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-kube-api-access-9rd68\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102250 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-var-lib-openvswitch\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102329 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-var-lib-openvswitch\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102319 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-etc-openvswitch\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102358 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-slash\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102386 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-log-socket\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102742 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-env-overrides\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102965 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-script-lib\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.106199 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovn-node-metrics-cert\") pod 
\"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.113435 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.113497 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"2f008ed10596c37892fa68d6a991ef4c4c25f62429883230c3721018781ad8a8"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.114479 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" event={"ID":"03ef49f1-0c6a-443a-8df3-2db339c562ed","Type":"ContainerStarted","Data":"8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.114507 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" event={"ID":"03ef49f1-0c6a-443a-8df3-2db339c562ed","Type":"ContainerStarted","Data":"2322d7ca26005146ca00985bb21a70c27a081273e8ba13a44e439ebce554fff7"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.116028 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-szscw" event={"ID":"e9789e58-12c8-4831-9401-af48a3e92209","Type":"ContainerStarted","Data":"e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.116074 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-szscw" 
event={"ID":"e9789e58-12c8-4831-9401-af48a3e92209","Type":"ContainerStarted","Data":"d14e6099281a3f4a5c54bb47b271be878d39affb726d275c2082db2728e837cc"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.116514 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.118826 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cnhkf" event={"ID":"3e686cd4-bddf-463e-b471-e49ea862691e","Type":"ContainerStarted","Data":"8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.118863 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cnhkf" event={"ID":"3e686cd4-bddf-463e-b471-e49ea862691e","Type":"ContainerStarted","Data":"ee4ee15b9147a70ac3d21b58b9d3cb23b0646c2fdf284bbf82e86743ee26a221"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.121507 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rd68\" (UniqueName: \"kubernetes.io/projected/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-kube-api-access-9rd68\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.133729 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.146628 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.161145 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.175065 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.190225 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.194907 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.194954 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.194964 4816 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.194979 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.194988 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.207916 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.208354 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: W0316 00:08:32.218928 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ca6e6b1_6b9c_4bb0_8e08_8201c9c53e88.slice/crio-5a5bb9275c0f8cb8f5457ed5c2f6ecf42790ebe9298a9783a1a55a1c78e14761 WatchSource:0}: Error finding container 5a5bb9275c0f8cb8f5457ed5c2f6ecf42790ebe9298a9783a1a55a1c78e14761: Status 404 returned error can't find the container with id 5a5bb9275c0f8cb8f5457ed5c2f6ecf42790ebe9298a9783a1a55a1c78e14761 Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.225095 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.239538 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.262908 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.282450 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.295673 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.296965 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.296994 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.297007 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.297023 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.297035 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.323997 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.348226 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.362025 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.376809 4816 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.399162 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.399207 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.399216 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.399236 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.399246 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.502751 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.502828 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.502847 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.502874 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.502892 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.606100 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.606157 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.606178 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.606227 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.606251 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.666866 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.666949 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.667037 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:32 crc kubenswrapper[4816]: E0316 00:08:32.667053 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:32 crc kubenswrapper[4816]: E0316 00:08:32.667128 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:32 crc kubenswrapper[4816]: E0316 00:08:32.667267 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.708789 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.708835 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.708846 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.708863 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.708875 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.811836 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.811880 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.811894 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.811913 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.811927 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.914945 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.914977 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.914985 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.915000 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.915009 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.018613 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.018995 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.019013 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.019035 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.019051 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.121833 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.121867 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.121874 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.121889 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.121898 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.124594 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd" exitCode=0 Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.124653 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.124675 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"5a5bb9275c0f8cb8f5457ed5c2f6ecf42790ebe9298a9783a1a55a1c78e14761"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.128757 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.134187 4816 generic.go:334] "Generic (PLEG): container finished" podID="03ef49f1-0c6a-443a-8df3-2db339c562ed" containerID="8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08" exitCode=0 Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.134254 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" event={"ID":"03ef49f1-0c6a-443a-8df3-2db339c562ed","Type":"ContainerDied","Data":"8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.143514 4816 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.168650 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.188465 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.210063 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ent
rypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.224371 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.224416 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.224425 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc 
kubenswrapper[4816]: I0316 00:08:33.224442 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.224455 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.230670 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.247593 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.271199 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.287960 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.307306 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.327821 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.327861 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.327870 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc 
kubenswrapper[4816]: I0316 00:08:33.327885 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.327896 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.331850 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.357299 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerI
D\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.380208 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.396787 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.414821 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.430466 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.430490 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.430498 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.430513 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.430522 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.431996 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.451940 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.475946 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.496764 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.514511 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.536307 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.536352 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.536364 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 
00:08:33.536384 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.536396 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.543458 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.560983 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.574312 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.607081 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.631918 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.650337 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.650398 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.650416 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.650442 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.650459 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.650885 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.668329 4816 scope.go:117] "RemoveContainer" containerID="29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.668903 4816 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: E0316 00:08:33.670292 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.753977 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.754031 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.754046 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.754075 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.754087 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.855992 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.856037 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.856049 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.856069 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.856090 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.959113 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.959151 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.959163 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.959182 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.959194 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.062749 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.062797 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.062810 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.062830 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.062843 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.140234 4816 generic.go:334] "Generic (PLEG): container finished" podID="03ef49f1-0c6a-443a-8df3-2db339c562ed" containerID="fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300" exitCode=0 Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.140313 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" event={"ID":"03ef49f1-0c6a-443a-8df3-2db339c562ed","Type":"ContainerDied","Data":"fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.153314 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.153387 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.153410 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.153428 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.153447 4816 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.153464 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.162482 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.166437 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.166475 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.166490 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.166513 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.166530 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.183019 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.214602 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.227942 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.240404 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.261432 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.270101 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.270158 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.270174 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.270198 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.270213 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.278028 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.292459 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.303144 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.316949 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.331886 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.346295 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.365578 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.372781 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.372808 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.372816 4816 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.372829 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.372839 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.475418 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.475460 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.475470 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.475487 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.475499 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.578509 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.578601 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.578623 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.578647 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.578664 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.667367 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.667417 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.667491 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:34 crc kubenswrapper[4816]: E0316 00:08:34.667683 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:34 crc kubenswrapper[4816]: E0316 00:08:34.667869 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:34 crc kubenswrapper[4816]: E0316 00:08:34.668004 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.681332 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.681385 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.681403 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.681426 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.681443 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.787666 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.787771 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.787829 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.787878 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.787907 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.891365 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.891649 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.891676 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.891701 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.891719 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.994353 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.994438 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.994453 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.994472 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.994487 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.097006 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.097071 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.097088 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.097115 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.097133 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:35Z","lastTransitionTime":"2026-03-16T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.160188 4816 generic.go:334] "Generic (PLEG): container finished" podID="03ef49f1-0c6a-443a-8df3-2db339c562ed" containerID="33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce" exitCode=0 Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.160259 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" event={"ID":"03ef49f1-0c6a-443a-8df3-2db339c562ed","Type":"ContainerDied","Data":"33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.203206 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.203271 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.203289 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.203316 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.203333 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:35Z","lastTransitionTime":"2026-03-16T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.205148 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.226671 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.245481 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.275588 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.292686 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.307380 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.307433 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.307448 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.307469 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.307483 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:35Z","lastTransitionTime":"2026-03-16T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.312504 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.326875 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.345167 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.358323 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.371191 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.387407 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 
00:08:35.402571 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.409672 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.409704 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.409713 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.409729 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.409738 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:35Z","lastTransitionTime":"2026-03-16T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.417039 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.513632 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.513714 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.513756 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.513777 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.513831 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:35Z","lastTransitionTime":"2026-03-16T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.616957 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.617005 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.617017 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.617036 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.617048 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:35Z","lastTransitionTime":"2026-03-16T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.721599 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.721668 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.721685 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.721712 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.721729 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:35Z","lastTransitionTime":"2026-03-16T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.824919 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.824988 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.825018 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.825047 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.825073 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:35Z","lastTransitionTime":"2026-03-16T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.929257 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.929320 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.929340 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.929364 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.929381 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:35Z","lastTransitionTime":"2026-03-16T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.032616 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.032686 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.032706 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.032737 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.032764 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.135747 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.135781 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.135793 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.135809 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.135820 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.166681 4816 generic.go:334] "Generic (PLEG): container finished" podID="03ef49f1-0c6a-443a-8df3-2db339c562ed" containerID="b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc" exitCode=0 Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.166732 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" event={"ID":"03ef49f1-0c6a-443a-8df3-2db339c562ed","Type":"ContainerDied","Data":"b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.186623 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.201586 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.230656 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.246351 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.246406 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.246418 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.246443 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.246463 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.252403 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z 
is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.270010 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.291629 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.306286 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.326359 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.349643 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.349710 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.349725 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.349749 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.349765 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.361151 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.391845 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3
9d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.408640 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.421286 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.437930 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.451750 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.451779 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.451789 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.451806 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.451823 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.555240 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.555297 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.555312 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.555330 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.555348 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.658271 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.658316 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.658328 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.658345 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.658355 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.666858 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.666883 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:36 crc kubenswrapper[4816]: E0316 00:08:36.666985 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.667052 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:36 crc kubenswrapper[4816]: E0316 00:08:36.667325 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:36 crc kubenswrapper[4816]: E0316 00:08:36.667399 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.760966 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.761028 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.761044 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.761070 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.761087 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.864860 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.864903 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.864923 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.864948 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.864964 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.967538 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.967612 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.967626 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.967650 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.967663 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.071310 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.071388 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.071403 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.071423 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.071439 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.173676 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.173746 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.173773 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.173803 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.173828 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.177098 4816 generic.go:334] "Generic (PLEG): container finished" podID="03ef49f1-0c6a-443a-8df3-2db339c562ed" containerID="fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512" exitCode=0 Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.177227 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" event={"ID":"03ef49f1-0c6a-443a-8df3-2db339c562ed","Type":"ContainerDied","Data":"fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.184659 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.217084 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.236384 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.254419 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.272691 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.276416 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.276446 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.276458 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.276475 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.276487 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.287738 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.306924 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.329857 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.343488 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.356515 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.374191 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.379062 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.379111 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.379125 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.379143 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.379156 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.386520 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.397479 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.407028 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.481499 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.481541 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.481568 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.481584 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.481594 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.584811 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.584890 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.584902 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.584917 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.584928 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.682803 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.687436 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.687501 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.687526 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.687585 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.687610 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.700159 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.716270 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.734947 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.750510 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.763730 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.789807 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.789851 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.789862 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 
00:08:37.789883 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.789897 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.798658 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.813815 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.831421 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.859100 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.872524 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.891495 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.891594 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.891617 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.891644 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.891663 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.893668 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.910428 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.994372 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.994406 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.994415 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.994428 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.994437 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.097281 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.097349 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.097371 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.097402 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.097427 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:38Z","lastTransitionTime":"2026-03-16T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.194808 4816 generic.go:334] "Generic (PLEG): container finished" podID="03ef49f1-0c6a-443a-8df3-2db339c562ed" containerID="dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954" exitCode=0 Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.194914 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" event={"ID":"03ef49f1-0c6a-443a-8df3-2db339c562ed","Type":"ContainerDied","Data":"dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.200878 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.200951 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.200982 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.201013 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.201039 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:38Z","lastTransitionTime":"2026-03-16T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.219699 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.242339 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.274484 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.301962 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.308263 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.308319 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.308332 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.308355 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.308374 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:38Z","lastTransitionTime":"2026-03-16T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.322233 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.347247 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.367130 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-lhpbn"] Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.367713 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.371191 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.371267 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.371320 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.372788 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.378457 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.395925 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.407399 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.411691 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.411731 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.411748 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.411773 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.411792 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:38Z","lastTransitionTime":"2026-03-16T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.422685 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.435776 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.450306 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.466265 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.474300 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2x8m\" (UniqueName: \"kubernetes.io/projected/6ec6a8ee-efd9-45df-bb35-706fcc90ebe9-kube-api-access-c2x8m\") pod \"node-ca-lhpbn\" (UID: \"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\") " pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.474407 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ec6a8ee-efd9-45df-bb35-706fcc90ebe9-host\") pod \"node-ca-lhpbn\" (UID: \"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\") " 
pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.474448 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6ec6a8ee-efd9-45df-bb35-706fcc90ebe9-serviceca\") pod \"node-ca-lhpbn\" (UID: \"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\") " pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.480657 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.496827 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.514782 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.514821 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.514832 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 
00:08:38.514851 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.514865 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:38Z","lastTransitionTime":"2026-03-16T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.521330 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.540061 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.555753 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.576027 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ec6a8ee-efd9-45df-bb35-706fcc90ebe9-host\") pod \"node-ca-lhpbn\" (UID: \"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\") " pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.576075 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" 
(UniqueName: \"kubernetes.io/configmap/6ec6a8ee-efd9-45df-bb35-706fcc90ebe9-serviceca\") pod \"node-ca-lhpbn\" (UID: \"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\") " pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.576143 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2x8m\" (UniqueName: \"kubernetes.io/projected/6ec6a8ee-efd9-45df-bb35-706fcc90ebe9-kube-api-access-c2x8m\") pod \"node-ca-lhpbn\" (UID: \"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\") " pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.576200 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ec6a8ee-efd9-45df-bb35-706fcc90ebe9-host\") pod \"node-ca-lhpbn\" (UID: \"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\") " pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.577896 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6ec6a8ee-efd9-45df-bb35-706fcc90ebe9-serviceca\") pod \"node-ca-lhpbn\" (UID: \"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\") " pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.581806 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.594074 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2x8m\" (UniqueName: \"kubernetes.io/projected/6ec6a8ee-efd9-45df-bb35-706fcc90ebe9-kube-api-access-c2x8m\") pod \"node-ca-lhpbn\" (UID: \"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\") " pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.597889 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.613057 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.617290 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.617343 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.617354 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.617376 4816 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.617389 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:38Z","lastTransitionTime":"2026-03-16T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.626155 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.639309 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3
9d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.651815 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.662945 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.667535 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.667710 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:38 crc kubenswrapper[4816]: E0316 00:08:38.667786 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:38 crc kubenswrapper[4816]: E0316 00:08:38.667707 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.667811 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:38 crc kubenswrapper[4816]: E0316 00:08:38.668058 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.675998 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.686049 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.699237 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: W0316 00:08:38.711656 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ec6a8ee_efd9_45df_bb35_706fcc90ebe9.slice/crio-d491dbc9accdecc3ae17cc01a0289ebee30340a6935a058858248641b11f3e8f WatchSource:0}: Error finding container d491dbc9accdecc3ae17cc01a0289ebee30340a6935a058858248641b11f3e8f: Status 404 returned error can't find the container with id d491dbc9accdecc3ae17cc01a0289ebee30340a6935a058858248641b11f3e8f Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.719484 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.719527 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.719540 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.719587 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.719608 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:38Z","lastTransitionTime":"2026-03-16T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.822231 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.822292 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.822305 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.822328 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.822345 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:38Z","lastTransitionTime":"2026-03-16T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.924674 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.924744 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.924756 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.924798 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.924811 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:38Z","lastTransitionTime":"2026-03-16T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.009056 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.009122 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.009132 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.009147 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.009175 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: E0316 00:08:39.023506 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.028060 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.028111 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.028127 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.028148 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.028162 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: E0316 00:08:39.040479 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.046273 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.046350 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.046369 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.046398 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.046420 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: E0316 00:08:39.062837 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.074860 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.074896 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.074906 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.074923 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.074936 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: E0316 00:08:39.087995 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.092091 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.092130 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.092143 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.092161 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.092176 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: E0316 00:08:39.110983 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: E0316 00:08:39.111129 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.112728 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.112765 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.112776 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.112793 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.112805 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.203258 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" event={"ID":"03ef49f1-0c6a-443a-8df3-2db339c562ed","Type":"ContainerStarted","Data":"d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.208799 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.209396 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.209442 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.209625 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.211648 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lhpbn" event={"ID":"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9","Type":"ContainerStarted","Data":"a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.211719 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lhpbn" event={"ID":"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9","Type":"ContainerStarted","Data":"d491dbc9accdecc3ae17cc01a0289ebee30340a6935a058858248641b11f3e8f"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.215466 4816 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.215495 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.215507 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.215522 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.215532 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.218344 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.235039 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.241322 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.244232 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.250018 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.264737 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.280601 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.294179 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.309622 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.317834 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.317884 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.317897 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 
00:08:39.317915 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.317927 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.333305 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.351798 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.366225 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.387961 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.399992 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.413160 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.420035 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.420072 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.420083 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.420102 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.420113 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.423918 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.437867 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.450864 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.475836 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.495511 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.510640 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.522757 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.522800 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.522809 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.522823 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.522831 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.542643 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.563722 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-1
6T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.584426 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.601073 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.622596 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.625479 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.625536 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.625575 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.625601 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.625623 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.642201 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.655487 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.677118 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.688316 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.728423 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.728477 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.728491 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.728511 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.728529 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.831668 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.831697 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.831706 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.831718 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.831726 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.934446 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.934494 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.934511 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.934535 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.934576 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.037152 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.037217 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.037242 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.037274 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.037297 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.141700 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.141754 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.141788 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.141811 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.141826 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.244489 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.244574 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.244591 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.244611 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.244628 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.347651 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.347723 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.347740 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.347763 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.347782 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.417867 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.418122 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:09:12.418076803 +0000 UTC m=+145.514376796 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.450786 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.450845 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.450862 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.450887 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.450906 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.519737 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.519816 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.519866 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.519904 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520073 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520100 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520123 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520205 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:09:12.520177126 +0000 UTC m=+145.616477119 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520201 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520263 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520318 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520343 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:09:12.52031441 +0000 UTC m=+145.616614393 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520358 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520372 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:09:12.520358702 +0000 UTC m=+145.616658685 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520384 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520470 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-16 00:09:12.520446385 +0000 UTC m=+145.616746408 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.554594 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.554655 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.554671 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.554697 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.554714 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.657656 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.657689 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.657698 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.657713 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.657724 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.667722 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.667794 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.667819 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.667898 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.668006 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.668087 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.759676 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.759715 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.759725 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.759743 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.759756 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.861928 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.861964 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.861974 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.861989 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.862002 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.963981 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.964264 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.964273 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.964287 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.964295 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.068542 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.068645 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.068721 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.068754 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.068800 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:41Z","lastTransitionTime":"2026-03-16T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.171637 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.171679 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.171691 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.171707 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.171718 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:41Z","lastTransitionTime":"2026-03-16T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.275098 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.275147 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.275156 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.275172 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.275182 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:41Z","lastTransitionTime":"2026-03-16T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.378378 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.378415 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.378429 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.378446 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.378456 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:41Z","lastTransitionTime":"2026-03-16T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.481923 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.481987 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.482016 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.482048 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.482070 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:41Z","lastTransitionTime":"2026-03-16T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.585878 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.585945 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.585962 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.585988 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.586007 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:41Z","lastTransitionTime":"2026-03-16T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.688945 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.689000 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.689017 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.689045 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.689063 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:41Z","lastTransitionTime":"2026-03-16T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.792583 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.792632 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.792648 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.792670 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.792688 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:41Z","lastTransitionTime":"2026-03-16T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.895898 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.895954 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.895972 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.895997 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.896016 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:41Z","lastTransitionTime":"2026-03-16T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.999910 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.999982 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.000001 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.000026 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.000045 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.103133 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.103195 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.103212 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.103238 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.103258 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.206205 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.206258 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.206277 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.206301 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.206318 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.225300 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/0.log" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.230351 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651" exitCode=1 Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.230417 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.231645 4816 scope.go:117] "RemoveContainer" containerID="a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.256864 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.277463 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.318462 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.318516 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.318537 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.318590 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.318612 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.328977 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"message\\\":\\\"ng *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:41.339626 6672 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:41.339647 6672 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:41.339627 6672 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:41.339804 6672 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:41.339837 6672 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:41.339843 6672 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:41.339871 6672 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:41.340083 6672 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:41.340097 6672 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:41.340108 6672 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:41.340116 6672 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:41.340212 6672 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:41.340278 6672 factory.go:656] Stopping watch factory\\\\nI0316 00:08:41.340297 6672 ovnkube.go:599] Stopped ovnkube\\\\nI0316 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.370529 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.386115 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.397322 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.406256 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.416967 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.420324 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.420358 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.420369 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.420386 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.420397 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.434196 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.450586 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.462330 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.480873 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3
9d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.494423 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.508308 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.523155 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.523223 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.523243 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.523314 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.523326 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.625361 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.625407 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.625424 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.625445 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.625457 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.667485 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.667506 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:42 crc kubenswrapper[4816]: E0316 00:08:42.667638 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.667605 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:42 crc kubenswrapper[4816]: E0316 00:08:42.667687 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:42 crc kubenswrapper[4816]: E0316 00:08:42.667894 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.727967 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.728014 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.728026 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.728043 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.728054 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.830266 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.830318 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.830334 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.830352 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.830361 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.932644 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.932676 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.932685 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.932700 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.932727 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.068162 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.068211 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.068227 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.068249 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.068445 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.170687 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.170739 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.170753 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.170774 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.170787 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.237134 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/0.log" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.241306 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.241917 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.263292 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.272838 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.272866 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.272875 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 
00:08:43.272889 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.272898 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.283065 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.308994 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"message\\\":\\\"ng *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:41.339626 6672 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:41.339647 6672 handler.go:208] Removed *v1.NetworkPolicy event handler 
4\\\\nI0316 00:08:41.339627 6672 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:41.339804 6672 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:41.339837 6672 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:41.339843 6672 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:41.339871 6672 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:41.340083 6672 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:41.340097 6672 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:41.340108 6672 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:41.340116 6672 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:41.340212 6672 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:41.340278 6672 factory.go:656] Stopping watch factory\\\\nI0316 00:08:41.340297 6672 ovnkube.go:599] Stopped ovnkube\\\\nI0316 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.327834 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.357186 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.370270 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.375336 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.375378 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.375390 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.375405 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.375415 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.383001 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.402988 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.418315 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.439883 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.452635 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.471745 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3
9d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.477652 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.477684 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.477692 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.477705 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.477714 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.484294 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.497945 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.579606 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.579659 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.579680 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.579704 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.579717 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.681742 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.681793 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.681830 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.681881 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.681896 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.785915 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.786011 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.786062 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.786112 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.786138 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.889014 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.889058 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.889073 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.889095 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.889111 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.991922 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.991970 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.991981 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.992012 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.992025 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.095657 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.095727 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.095742 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.095761 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.095800 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:44Z","lastTransitionTime":"2026-03-16T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.198441 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.198483 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.198496 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.198512 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.198523 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:44Z","lastTransitionTime":"2026-03-16T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.246668 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/1.log" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.247371 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/0.log" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.250365 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400" exitCode=1 Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.250421 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.250460 4816 scope.go:117] "RemoveContainer" containerID="a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.251169 4816 scope.go:117] "RemoveContainer" containerID="f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400" Mar 16 00:08:44 crc kubenswrapper[4816]: E0316 00:08:44.251354 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.271081 4816 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.287889 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.300979 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.301022 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.301032 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 
00:08:44.301045 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.301056 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:44Z","lastTransitionTime":"2026-03-16T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.302907 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.324196 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"message\\\":\\\"ng *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:41.339626 6672 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:41.339647 6672 handler.go:208] Removed *v1.NetworkPolicy event handler 
4\\\\nI0316 00:08:41.339627 6672 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:41.339804 6672 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:41.339837 6672 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:41.339843 6672 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:41.339871 6672 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:41.340083 6672 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:41.340097 6672 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:41.340108 6672 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:41.340116 6672 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:41.340212 6672 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:41.340278 6672 factory.go:656] Stopping watch factory\\\\nI0316 00:08:41.340297 6672 ovnkube.go:599] Stopped ovnkube\\\\nI0316 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"0:08:43.170302 6850 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:43.170323 6850 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:43.170342 6850 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:43.170346 6850 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:43.170361 6850 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:43.170363 6850 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170364 6850 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:43.170402 6850 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:43.170450 6850 factory.go:656] Stopping watch factory\\\\nI0316 00:08:43.170454 6850 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:43.170503 6850 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:43.170666 6850 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni
-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.349393 4816 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b
5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.365926 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.381203 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc 
kubenswrapper[4816]: I0316 00:08:44.388056 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5"] Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.388670 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.390314 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.390894 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.392988 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c4435
31de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.402761 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.402790 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.402802 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 
00:08:44.402818 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.402829 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:44Z","lastTransitionTime":"2026-03-16T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.407865 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.420439 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.436411 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.449374 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.468724 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3
9d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.485840 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.505593 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.505647 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.505661 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.505685 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.505700 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:44Z","lastTransitionTime":"2026-03-16T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.516495 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.529039 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.540843 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.559060 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"message\\\":\\\"ng *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:41.339626 6672 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:41.339647 6672 handler.go:208] Removed *v1.NetworkPolicy event handler 
4\\\\nI0316 00:08:41.339627 6672 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:41.339804 6672 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:41.339837 6672 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:41.339843 6672 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:41.339871 6672 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:41.340083 6672 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:41.340097 6672 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:41.340108 6672 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:41.340116 6672 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:41.340212 6672 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:41.340278 6672 factory.go:656] Stopping watch factory\\\\nI0316 00:08:41.340297 6672 ovnkube.go:599] Stopped ovnkube\\\\nI0316 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"0:08:43.170302 6850 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:43.170323 6850 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:43.170342 6850 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:43.170346 6850 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:43.170361 6850 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:43.170363 6850 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170364 6850 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:43.170402 6850 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:43.170450 6850 factory.go:656] Stopping watch factory\\\\nI0316 00:08:43.170454 6850 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:43.170503 6850 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:43.170666 6850 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni
-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.573395 4816 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.582086 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b28986d-e33b-4876-ab6d-64d69960fb8b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.582135 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zgr7\" (UniqueName: \"kubernetes.io/projected/7b28986d-e33b-4876-ab6d-64d69960fb8b-kube-api-access-9zgr7\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.582168 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b28986d-e33b-4876-ab6d-64d69960fb8b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.582197 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b28986d-e33b-4876-ab6d-64d69960fb8b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.588195 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061
a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.601908 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.607652 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.607685 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.607696 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.607713 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.607725 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:44Z","lastTransitionTime":"2026-03-16T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.615720 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.638700 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.653361 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.666781 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.666806 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.666781 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:44 crc kubenswrapper[4816]: E0316 00:08:44.666915 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:44 crc kubenswrapper[4816]: E0316 00:08:44.666984 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:44 crc kubenswrapper[4816]: E0316 00:08:44.667083 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.668643 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.683193 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zgr7\" (UniqueName: 
\"kubernetes.io/projected/7b28986d-e33b-4876-ab6d-64d69960fb8b-kube-api-access-9zgr7\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.683247 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b28986d-e33b-4876-ab6d-64d69960fb8b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.683303 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b28986d-e33b-4876-ab6d-64d69960fb8b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.683366 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b28986d-e33b-4876-ab6d-64d69960fb8b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.684331 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b28986d-e33b-4876-ab6d-64d69960fb8b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc 
kubenswrapper[4816]: I0316 00:08:44.684347 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b28986d-e33b-4876-ab6d-64d69960fb8b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.684715 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085
a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\
\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\
\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.690399 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b28986d-e33b-4876-ab6d-64d69960fb8b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.699871 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.703500 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zgr7\" (UniqueName: \"kubernetes.io/projected/7b28986d-e33b-4876-ab6d-64d69960fb8b-kube-api-access-9zgr7\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.707112 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.709908 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.710001 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.710216 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.710413 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.710616 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:44Z","lastTransitionTime":"2026-03-16T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.715696 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.737689 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.812746 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.812780 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.812791 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 
00:08:44.812807 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.812820 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:44Z","lastTransitionTime":"2026-03-16T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.915474 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.915520 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.915533 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.915590 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.915603 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:44Z","lastTransitionTime":"2026-03-16T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.019056 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.019120 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.019134 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.019162 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.019178 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.122396 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.122443 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.122455 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.122475 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.122488 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.136912 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-jqsjn"] Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.137890 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:45 crc kubenswrapper[4816]: E0316 00:08:45.138051 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.157332 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\
"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.185161 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.204651 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.220175 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.224183 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.224215 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.224226 4816 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.224241 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.224254 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.234880 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd996
38b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.243607 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc 
kubenswrapper[4816]: I0316 00:08:45.254161 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/1.log" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.256667 4816 scope.go:117] "RemoveContainer" containerID="f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400" Mar 16 00:08:45 crc kubenswrapper[4816]: E0316 00:08:45.256838 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.258025 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.258166 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" event={"ID":"7b28986d-e33b-4876-ab6d-64d69960fb8b","Type":"ContainerStarted","Data":"0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.258199 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" 
event={"ID":"7b28986d-e33b-4876-ab6d-64d69960fb8b","Type":"ContainerStarted","Data":"544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.258211 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" event={"ID":"7b28986d-e33b-4876-ab6d-64d69960fb8b","Type":"ContainerStarted","Data":"4c04c3194b0c2e512d22c356071f25ccde0add5c85d2b6122133e335f8944a82"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.276993 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.290098 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.290161 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxldb\" (UniqueName: \"kubernetes.io/projected/84360ef9-0450-44c5-80eb-eab1bf8e808b-kube-api-access-pxldb\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.297417 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.309094 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.319499 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.326468 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.326516 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.326530 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.326566 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.326578 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.347311 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"message\\\":\\\"ng *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:41.339626 6672 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:41.339647 6672 handler.go:208] Removed *v1.NetworkPolicy event handler 
4\\\\nI0316 00:08:41.339627 6672 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:41.339804 6672 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:41.339837 6672 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:41.339843 6672 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:41.339871 6672 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:41.340083 6672 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:41.340097 6672 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:41.340108 6672 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:41.340116 6672 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:41.340212 6672 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:41.340278 6672 factory.go:656] Stopping watch factory\\\\nI0316 00:08:41.340297 6672 ovnkube.go:599] Stopped ovnkube\\\\nI0316 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"0:08:43.170302 6850 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:43.170323 6850 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:43.170342 6850 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:43.170346 6850 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:43.170361 6850 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:43.170363 6850 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170364 6850 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:43.170402 6850 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:43.170450 6850 factory.go:656] Stopping watch factory\\\\nI0316 00:08:43.170454 6850 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:43.170503 6850 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:43.170666 6850 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni
-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.363313 4816 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.378642 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.390909 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxldb\" (UniqueName: \"kubernetes.io/projected/84360ef9-0450-44c5-80eb-eab1bf8e808b-kube-api-access-pxldb\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.391070 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:45 crc kubenswrapper[4816]: E0316 00:08:45.391481 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:45 crc kubenswrapper[4816]: E0316 00:08:45.391595 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs podName:84360ef9-0450-44c5-80eb-eab1bf8e808b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:45.891542213 +0000 UTC m=+118.987842186 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs") pod "network-metrics-daemon-jqsjn" (UID: "84360ef9-0450-44c5-80eb-eab1bf8e808b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.392338 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.402062 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.410478 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxldb\" (UniqueName: \"kubernetes.io/projected/84360ef9-0450-44c5-80eb-eab1bf8e808b-kube-api-access-pxldb\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.413507 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.424163 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.428563 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.428585 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.428594 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.428607 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.428615 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.435857 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc 
kubenswrapper[4816]: I0316 00:08:45.454133 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.465416 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b901333
49b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.476317 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a
7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.492522 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"0:08:43.170302 6850 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:43.170323 6850 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:43.170342 6850 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:43.170346 6850 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0316 00:08:43.170361 6850 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:43.170363 6850 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170364 6850 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:43.170402 6850 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:43.170450 6850 factory.go:656] Stopping watch factory\\\\nI0316 00:08:43.170454 6850 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:43.170503 6850 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:43.170666 6850 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.510694 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.525750 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.530851 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.530898 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.530910 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.530928 4816 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.530941 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.541057 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.553253 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd3483774
27596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.574433 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.587942 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.601171 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.618051 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.629160 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.632924 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.632973 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.632987 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.633010 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.633025 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.678272 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.735694 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.735755 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.735772 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.735798 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.735817 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.839008 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.839061 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.839077 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.839101 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.839118 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.895414 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:45 crc kubenswrapper[4816]: E0316 00:08:45.895696 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:45 crc kubenswrapper[4816]: E0316 00:08:45.895795 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs podName:84360ef9-0450-44c5-80eb-eab1bf8e808b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:46.895765188 +0000 UTC m=+119.992065171 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs") pod "network-metrics-daemon-jqsjn" (UID: "84360ef9-0450-44c5-80eb-eab1bf8e808b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.942681 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.942741 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.942751 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.942773 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.942787 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.045138 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.045205 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.045222 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.045242 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.045254 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.148625 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.148683 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.148694 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.148719 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.148732 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.251711 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.251769 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.251782 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.251803 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.251820 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.354895 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.354942 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.354956 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.354977 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.354992 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.458035 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.458352 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.458383 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.458412 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.458433 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.561420 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.561472 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.561490 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.561512 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.561532 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.664137 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.664191 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.664209 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.664234 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.664251 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.666772 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.666874 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.666915 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:46 crc kubenswrapper[4816]: E0316 00:08:46.667106 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.667211 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:46 crc kubenswrapper[4816]: E0316 00:08:46.667246 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:46 crc kubenswrapper[4816]: E0316 00:08:46.667424 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:46 crc kubenswrapper[4816]: E0316 00:08:46.667711 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.766920 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.766978 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.766997 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.767034 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.767052 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.869955 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.869996 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.870004 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.870019 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.870031 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.905027 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:46 crc kubenswrapper[4816]: E0316 00:08:46.905171 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:46 crc kubenswrapper[4816]: E0316 00:08:46.905239 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs podName:84360ef9-0450-44c5-80eb-eab1bf8e808b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:48.905223279 +0000 UTC m=+122.001523242 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs") pod "network-metrics-daemon-jqsjn" (UID: "84360ef9-0450-44c5-80eb-eab1bf8e808b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.974036 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.974097 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.974114 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.974140 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.974159 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.077219 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.077274 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.077285 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.077300 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.077309 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:47Z","lastTransitionTime":"2026-03-16T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.180368 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.180430 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.180463 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.180492 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.180512 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:47Z","lastTransitionTime":"2026-03-16T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.283736 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.283810 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.283834 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.283863 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.283885 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:47Z","lastTransitionTime":"2026-03-16T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.387078 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.387136 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.387152 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.387174 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.387191 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:47Z","lastTransitionTime":"2026-03-16T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.489984 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.490050 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.490076 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.490103 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.490120 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:47Z","lastTransitionTime":"2026-03-16T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:47 crc kubenswrapper[4816]: E0316 00:08:47.590894 4816 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.689896 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.709836 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.747594 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.772056 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: E0316 00:08:47.784596 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.798270 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63
f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.819192 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.840451 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.856614 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.872631 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.897087 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d
7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.914322 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.933817 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.953141 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"0:08:43.170302 6850 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:43.170323 6850 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:43.170342 6850 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:43.170346 6850 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0316 00:08:43.170361 6850 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:43.170363 6850 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170364 6850 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:43.170402 6850 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:43.170450 6850 factory.go:656] Stopping watch factory\\\\nI0316 00:08:43.170454 6850 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:43.170503 6850 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:43.170666 6850 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.970525 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.986382 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.995302 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:48 crc kubenswrapper[4816]: I0316 00:08:48.006988 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:48 crc kubenswrapper[4816]: I0316 00:08:48.667300 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:48 crc kubenswrapper[4816]: I0316 00:08:48.667359 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:48 crc kubenswrapper[4816]: E0316 00:08:48.667760 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:08:48 crc kubenswrapper[4816]: E0316 00:08:48.667914 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:48 crc kubenswrapper[4816]: I0316 00:08:48.668005 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:48 crc kubenswrapper[4816]: I0316 00:08:48.668061 4816 scope.go:117] "RemoveContainer" containerID="29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f" Mar 16 00:08:48 crc kubenswrapper[4816]: I0316 00:08:48.668133 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:48 crc kubenswrapper[4816]: E0316 00:08:48.668227 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:48 crc kubenswrapper[4816]: E0316 00:08:48.668308 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:48 crc kubenswrapper[4816]: I0316 00:08:48.928440 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:48 crc kubenswrapper[4816]: E0316 00:08:48.928605 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:48 crc kubenswrapper[4816]: E0316 00:08:48.928667 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs podName:84360ef9-0450-44c5-80eb-eab1bf8e808b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:52.928648623 +0000 UTC m=+126.024948576 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs") pod "network-metrics-daemon-jqsjn" (UID: "84360ef9-0450-44c5-80eb-eab1bf8e808b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.240116 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.240230 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.240253 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.240277 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.240294 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:49Z","lastTransitionTime":"2026-03-16T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:49 crc kubenswrapper[4816]: E0316 00:08:49.262302 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.267100 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.267168 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.267190 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.267217 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.267236 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:49Z","lastTransitionTime":"2026-03-16T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.276447 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.279447 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5"} Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.279970 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.303999 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: E0316 00:08:49.306924 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.312915 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.312988 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.313007 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.313034 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.313053 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:49Z","lastTransitionTime":"2026-03-16T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.321667 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: E0316 00:08:49.334520 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.339237 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.339280 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.339297 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.339320 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.339339 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:49Z","lastTransitionTime":"2026-03-16T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.343687 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: E0316 00:08:49.354888 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.359448 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.359488 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.359504 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.359526 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.359542 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:49Z","lastTransitionTime":"2026-03-16T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.359611 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc 
kubenswrapper[4816]: E0316 00:08:49.379200 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: E0316 00:08:49.379451 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.388190 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[
{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainer
Statuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.401587 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.417945 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.440612 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"0:08:43.170302 6850 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:43.170323 6850 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:43.170342 6850 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:43.170346 6850 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0316 00:08:43.170361 6850 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:43.170363 6850 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170364 6850 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:43.170402 6850 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:43.170450 6850 factory.go:656] Stopping watch factory\\\\nI0316 00:08:43.170454 6850 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:43.170503 6850 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:43.170666 6850 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.459294 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.472281 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.481770 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.495770 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.514487 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.527669 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.541918 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.559493 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.572643 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:50 crc kubenswrapper[4816]: I0316 00:08:50.667436 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:50 crc kubenswrapper[4816]: I0316 00:08:50.667526 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:50 crc kubenswrapper[4816]: I0316 00:08:50.667466 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:50 crc kubenswrapper[4816]: I0316 00:08:50.667466 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:50 crc kubenswrapper[4816]: E0316 00:08:50.667668 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:50 crc kubenswrapper[4816]: E0316 00:08:50.667787 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:50 crc kubenswrapper[4816]: E0316 00:08:50.667892 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:08:50 crc kubenswrapper[4816]: E0316 00:08:50.668003 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:52 crc kubenswrapper[4816]: I0316 00:08:52.666741 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:52 crc kubenswrapper[4816]: I0316 00:08:52.666764 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:52 crc kubenswrapper[4816]: I0316 00:08:52.666805 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:52 crc kubenswrapper[4816]: E0316 00:08:52.667431 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:08:52 crc kubenswrapper[4816]: E0316 00:08:52.667233 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:52 crc kubenswrapper[4816]: I0316 00:08:52.666896 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:52 crc kubenswrapper[4816]: E0316 00:08:52.667661 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:52 crc kubenswrapper[4816]: E0316 00:08:52.667765 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:52 crc kubenswrapper[4816]: E0316 00:08:52.785732 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:08:52 crc kubenswrapper[4816]: I0316 00:08:52.986618 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:52 crc kubenswrapper[4816]: E0316 00:08:52.986895 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:52 crc kubenswrapper[4816]: E0316 00:08:52.987001 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs podName:84360ef9-0450-44c5-80eb-eab1bf8e808b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:00.986974893 +0000 UTC m=+134.083274876 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs") pod "network-metrics-daemon-jqsjn" (UID: "84360ef9-0450-44c5-80eb-eab1bf8e808b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:54 crc kubenswrapper[4816]: I0316 00:08:54.666683 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:54 crc kubenswrapper[4816]: I0316 00:08:54.666750 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:54 crc kubenswrapper[4816]: I0316 00:08:54.666699 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:54 crc kubenswrapper[4816]: I0316 00:08:54.666683 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:54 crc kubenswrapper[4816]: E0316 00:08:54.666882 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:08:54 crc kubenswrapper[4816]: E0316 00:08:54.666999 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:54 crc kubenswrapper[4816]: E0316 00:08:54.667204 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:54 crc kubenswrapper[4816]: E0316 00:08:54.667338 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:56 crc kubenswrapper[4816]: I0316 00:08:56.667597 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:56 crc kubenswrapper[4816]: I0316 00:08:56.667723 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:56 crc kubenswrapper[4816]: E0316 00:08:56.668033 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:08:56 crc kubenswrapper[4816]: I0316 00:08:56.668085 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:56 crc kubenswrapper[4816]: I0316 00:08:56.668119 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:56 crc kubenswrapper[4816]: E0316 00:08:56.668257 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:56 crc kubenswrapper[4816]: E0316 00:08:56.668352 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:56 crc kubenswrapper[4816]: E0316 00:08:56.668486 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:56 crc kubenswrapper[4816]: I0316 00:08:56.669505 4816 scope.go:117] "RemoveContainer" containerID="f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.309474 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/1.log" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.311370 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e"} Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.311727 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 
00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.323571 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.336472 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.352821 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.364950 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.385474 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d
7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.397725 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.409045 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.434576 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"0:08:43.170302 6850 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:43.170323 6850 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:43.170342 6850 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:43.170346 6850 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0316 00:08:43.170361 6850 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:43.170363 6850 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170364 6850 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:43.170402 6850 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:43.170450 6850 factory.go:656] Stopping watch factory\\\\nI0316 00:08:43.170454 6850 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:43.170503 6850 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:43.170666 6850 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\
\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.447569 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.461471 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.473616 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\
"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.492893 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b
3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.515432 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.530908 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.547592 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.572628 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.586865 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.691513 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.714948 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.730842 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.747358 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.767935 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.786493 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: E0316 00:08:57.786625 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.804616 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.828672 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.846639 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.865095 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.885007 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.905349 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.920932 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.949572 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d
7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.969708 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.986591 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.017407 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"0:08:43.170302 6850 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:43.170323 6850 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:43.170342 6850 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:43.170346 6850 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0316 00:08:43.170361 6850 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:43.170363 6850 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170364 6850 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:43.170402 6850 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:43.170450 6850 factory.go:656] Stopping watch factory\\\\nI0316 00:08:43.170454 6850 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:43.170503 6850 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:43.170666 6850 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\
\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.317332 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/2.log" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.318326 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/1.log" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.322945 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e" exitCode=1 Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.323016 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e"} Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.323070 4816 scope.go:117] "RemoveContainer" containerID="f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.324108 4816 scope.go:117] "RemoveContainer" containerID="cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e" Mar 16 00:08:58 crc kubenswrapper[4816]: E0316 00:08:58.324428 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" 
pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.358169 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.382032 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.398014 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.418958 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.438221 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.459317 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.479796 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.495762 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.514802 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0a
d32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.531750 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a
7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.564491 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"0:08:43.170302 6850 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:43.170323 6850 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:43.170342 6850 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:43.170346 6850 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0316 00:08:43.170361 6850 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:43.170363 6850 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170364 6850 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:43.170402 6850 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:43.170450 6850 factory.go:656] Stopping watch factory\\\\nI0316 00:08:43.170454 6850 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:43.170503 6850 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:43.170666 6850 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:57Z\\\",\\\"message\\\":\\\"} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z]\\\\nI0316 
00:08:57.626466 7110 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI0316 00:08:57.626473 7110 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-di
r\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name
\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: 
I0316 00:08:58.598641 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.620229 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95d
ae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.640091 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.656632 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.667338 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.667445 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:58 crc kubenswrapper[4816]: E0316 00:08:58.667843 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.667912 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.668007 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:58 crc kubenswrapper[4816]: E0316 00:08:58.668183 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:58 crc kubenswrapper[4816]: E0316 00:08:58.668388 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:58 crc kubenswrapper[4816]: E0316 00:08:58.668609 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.681512 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.704036 4816 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.330546 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/2.log" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.338168 4816 scope.go:117] "RemoveContainer" containerID="cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e" Mar 16 00:08:59 crc kubenswrapper[4816]: E0316 00:08:59.338630 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.358133 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.378728 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.397776 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.412425 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.451386 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d
7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.474800 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.490143 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.493873 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.493916 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.493933 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.493957 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.493974 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:59Z","lastTransitionTime":"2026-03-16T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:59 crc kubenswrapper[4816]: E0316 00:08:59.508988 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.512541 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.512608 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.512619 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.512635 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.512647 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:59Z","lastTransitionTime":"2026-03-16T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.518269 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:57Z\\\",\\\"message\\\":\\\"} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z]\\\\nI0316 00:08:57.626466 7110 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI0316 00:08:57.626473 7110 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: E0316 00:08:59.525823 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.529749 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.529797 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.529809 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.529827 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.529839 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:59Z","lastTransitionTime":"2026-03-16T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.538971 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: E0316 00:08:59.542045 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.546302 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.546335 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.546346 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.546364 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.546376 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:59Z","lastTransitionTime":"2026-03-16T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.552095 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: E0316 00:08:59.560794 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.563275 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.565864 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.565907 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.565920 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.565938 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.565952 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:59Z","lastTransitionTime":"2026-03-16T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.576530 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: E0316 
00:08:59.580661 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: E0316 00:08:59.580815 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.595935 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.608152 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.627018 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.648406 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.664745 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:00 crc kubenswrapper[4816]: I0316 00:09:00.667238 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:00 crc kubenswrapper[4816]: I0316 00:09:00.667323 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:00 crc kubenswrapper[4816]: E0316 00:09:00.667389 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:00 crc kubenswrapper[4816]: I0316 00:09:00.667437 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:00 crc kubenswrapper[4816]: E0316 00:09:00.667457 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:00 crc kubenswrapper[4816]: I0316 00:09:00.667317 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:00 crc kubenswrapper[4816]: E0316 00:09:00.667679 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:00 crc kubenswrapper[4816]: E0316 00:09:00.667916 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:00 crc kubenswrapper[4816]: I0316 00:09:00.682457 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.085499 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:01 crc kubenswrapper[4816]: E0316 00:09:01.085828 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:09:01 crc kubenswrapper[4816]: E0316 00:09:01.085951 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs podName:84360ef9-0450-44c5-80eb-eab1bf8e808b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:17.085924331 +0000 UTC m=+150.182224314 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs") pod "network-metrics-daemon-jqsjn" (UID: "84360ef9-0450-44c5-80eb-eab1bf8e808b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.299146 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.323479 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}]
,\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.345365 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.364503 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.383756 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.397386 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.413732 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.433948 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.452270 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.472305 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.488817 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.522082 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d
7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.545287 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.563831 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.594626 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:57Z\\\",\\\"message\\\":\\\"} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z]\\\\nI0316 00:08:57.626466 7110 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI0316 00:08:57.626473 7110 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.617986 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.639396 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.656007 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.676894 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4816]: I0316 00:09:02.667696 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:02 crc kubenswrapper[4816]: I0316 00:09:02.667724 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:02 crc kubenswrapper[4816]: E0316 00:09:02.668208 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:02 crc kubenswrapper[4816]: I0316 00:09:02.667787 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:02 crc kubenswrapper[4816]: I0316 00:09:02.667770 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:02 crc kubenswrapper[4816]: E0316 00:09:02.668395 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:02 crc kubenswrapper[4816]: E0316 00:09:02.668471 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:02 crc kubenswrapper[4816]: E0316 00:09:02.668609 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:02 crc kubenswrapper[4816]: E0316 00:09:02.788014 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:04 crc kubenswrapper[4816]: I0316 00:09:04.666931 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:04 crc kubenswrapper[4816]: I0316 00:09:04.667007 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:04 crc kubenswrapper[4816]: E0316 00:09:04.667111 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:04 crc kubenswrapper[4816]: I0316 00:09:04.667131 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:04 crc kubenswrapper[4816]: I0316 00:09:04.667157 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:04 crc kubenswrapper[4816]: E0316 00:09:04.667259 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:04 crc kubenswrapper[4816]: E0316 00:09:04.667354 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:04 crc kubenswrapper[4816]: E0316 00:09:04.667487 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:06 crc kubenswrapper[4816]: I0316 00:09:06.667045 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:06 crc kubenswrapper[4816]: I0316 00:09:06.667088 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:06 crc kubenswrapper[4816]: I0316 00:09:06.667105 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:06 crc kubenswrapper[4816]: I0316 00:09:06.667059 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:06 crc kubenswrapper[4816]: E0316 00:09:06.667385 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:06 crc kubenswrapper[4816]: E0316 00:09:06.667482 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:06 crc kubenswrapper[4816]: E0316 00:09:06.667335 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:06 crc kubenswrapper[4816]: E0316 00:09:06.667667 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.692388 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.711286 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.732251 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.756159 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.773199 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: E0316 00:09:07.789094 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.796527 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.811479 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.824753 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.838691 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.854933 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.882024 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d
7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.903457 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.921840 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.955418 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:57Z\\\",\\\"message\\\":\\\"} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z]\\\\nI0316 00:08:57.626466 7110 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI0316 00:08:57.626473 7110 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.981481 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:08 crc kubenswrapper[4816]: I0316 00:09:08.003088 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:08 crc kubenswrapper[4816]: I0316 00:09:08.020073 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:08 crc kubenswrapper[4816]: I0316 00:09:08.036377 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:08 crc kubenswrapper[4816]: I0316 00:09:08.666978 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:08 crc kubenswrapper[4816]: I0316 00:09:08.667008 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:08 crc kubenswrapper[4816]: E0316 00:09:08.667116 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:08 crc kubenswrapper[4816]: I0316 00:09:08.667167 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:08 crc kubenswrapper[4816]: E0316 00:09:08.667297 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:08 crc kubenswrapper[4816]: I0316 00:09:08.667316 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:08 crc kubenswrapper[4816]: E0316 00:09:08.667368 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:08 crc kubenswrapper[4816]: E0316 00:09:08.667421 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.836607 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.836675 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.836692 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.836718 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.836736 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:09Z","lastTransitionTime":"2026-03-16T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:09 crc kubenswrapper[4816]: E0316 00:09:09.858092 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:09Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.862070 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.862116 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.862135 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.862156 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.862173 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:09Z","lastTransitionTime":"2026-03-16T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:09 crc kubenswrapper[4816]: E0316 00:09:09.882014 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:09Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.887391 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.887744 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.887946 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.888145 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.888332 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:09Z","lastTransitionTime":"2026-03-16T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:09 crc kubenswrapper[4816]: E0316 00:09:09.907643 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:09Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.912696 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.912860 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.912982 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.913072 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.913168 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:09Z","lastTransitionTime":"2026-03-16T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:09 crc kubenswrapper[4816]: E0316 00:09:09.928276 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:09Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.932455 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.932505 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.932520 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.932540 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.932586 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:09Z","lastTransitionTime":"2026-03-16T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:09 crc kubenswrapper[4816]: E0316 00:09:09.952629 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:09Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:09 crc kubenswrapper[4816]: E0316 00:09:09.952861 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:09:10 crc kubenswrapper[4816]: I0316 00:09:10.667251 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:10 crc kubenswrapper[4816]: I0316 00:09:10.667402 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:10 crc kubenswrapper[4816]: E0316 00:09:10.667626 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:10 crc kubenswrapper[4816]: I0316 00:09:10.667717 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:10 crc kubenswrapper[4816]: I0316 00:09:10.667731 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:10 crc kubenswrapper[4816]: E0316 00:09:10.667917 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:10 crc kubenswrapper[4816]: E0316 00:09:10.668097 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:10 crc kubenswrapper[4816]: E0316 00:09:10.668611 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.514502 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.515200 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:16.515148953 +0000 UTC m=+209.611448946 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.616328 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.616786 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.616509 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.616966 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617008 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617032 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.616892 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617111 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:16.617052604 +0000 UTC m=+209.713352797 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617192 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:16.617160827 +0000 UTC m=+209.713460810 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.617242 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617335 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617389 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617445 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:16.617430745 +0000 UTC m=+209.713730728 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617402 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617657 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617738 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:16.617727334 +0000 UTC m=+209.714027287 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.667362 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.667370 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.667399 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.667423 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.668086 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.668069 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.668247 4816 scope.go:117] "RemoveContainer" containerID="cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e" Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.668261 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.668337 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.668542 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.791374 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 16 00:09:14 crc kubenswrapper[4816]: I0316 00:09:14.667479 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:14 crc kubenswrapper[4816]: I0316 00:09:14.667524 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:14 crc kubenswrapper[4816]: I0316 00:09:14.667529 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:14 crc kubenswrapper[4816]: I0316 00:09:14.667614 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:14 crc kubenswrapper[4816]: E0316 00:09:14.669297 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:14 crc kubenswrapper[4816]: E0316 00:09:14.669442 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:14 crc kubenswrapper[4816]: E0316 00:09:14.669588 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:14 crc kubenswrapper[4816]: E0316 00:09:14.669663 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:14 crc kubenswrapper[4816]: I0316 00:09:14.684674 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 16 00:09:16 crc kubenswrapper[4816]: I0316 00:09:16.666956 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:16 crc kubenswrapper[4816]: E0316 00:09:16.667197 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:16 crc kubenswrapper[4816]: I0316 00:09:16.667684 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:16 crc kubenswrapper[4816]: E0316 00:09:16.667829 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:16 crc kubenswrapper[4816]: I0316 00:09:16.667885 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:16 crc kubenswrapper[4816]: E0316 00:09:16.667964 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:16 crc kubenswrapper[4816]: I0316 00:09:16.669475 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:16 crc kubenswrapper[4816]: E0316 00:09:16.669708 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.167542 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:17 crc kubenswrapper[4816]: E0316 00:09:17.167839 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:09:17 crc kubenswrapper[4816]: E0316 00:09:17.168008 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs podName:84360ef9-0450-44c5-80eb-eab1bf8e808b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:49.167952979 +0000 UTC m=+182.264252972 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs") pod "network-metrics-daemon-jqsjn" (UID: "84360ef9-0450-44c5-80eb-eab1bf8e808b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.681709 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"580d9565-d55c-4108-af2d-07ab6fa7ea2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a41f624c9d40a637cf958648b795587652d7139d4b7cea2e8c70d1cdf05dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.701732 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.720774 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.740257 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.756815 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.779228 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: E0316 00:09:17.792883 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.800838 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.823865 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.847080 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.860339 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.876507 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.894123 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.911802 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.928903 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.943638 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.975203 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d
7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.996347 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:18 crc kubenswrapper[4816]: I0316 00:09:18.015003 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:18 crc kubenswrapper[4816]: I0316 00:09:18.045910 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:57Z\\\",\\\"message\\\":\\\"} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z]\\\\nI0316 00:08:57.626466 7110 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI0316 00:08:57.626473 7110 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:18 crc kubenswrapper[4816]: I0316 00:09:18.667206 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:18 crc kubenswrapper[4816]: I0316 00:09:18.667342 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:18 crc kubenswrapper[4816]: E0316 00:09:18.667404 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:18 crc kubenswrapper[4816]: I0316 00:09:18.667223 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:18 crc kubenswrapper[4816]: I0316 00:09:18.667342 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:18 crc kubenswrapper[4816]: E0316 00:09:18.667585 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:18 crc kubenswrapper[4816]: E0316 00:09:18.667752 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:18 crc kubenswrapper[4816]: E0316 00:09:18.667887 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.420044 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-szscw_e9789e58-12c8-4831-9401-af48a3e92209/kube-multus/0.log" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.420121 4816 generic.go:334] "Generic (PLEG): container finished" podID="e9789e58-12c8-4831-9401-af48a3e92209" containerID="e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962" exitCode=1 Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.420172 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-szscw" event={"ID":"e9789e58-12c8-4831-9401-af48a3e92209","Type":"ContainerDied","Data":"e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962"} Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.420856 4816 scope.go:117] "RemoveContainer" containerID="e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.441478 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"580d9565-d55c-4108-af2d-07ab6fa7ea2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a41f624c9d40a637cf958648b795587652d7139d4b7cea2e8c70d1cdf05dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.461956 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.480680 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.493898 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.512629 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.536450 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.554609 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.575106 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.591354 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.603847 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.621963 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.638897 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.652019 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.670059 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.685254 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.709933 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d
7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.726870 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:18Z\\\",\\\"message\\\":\\\"2026-03-16T00:08:33+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe\\\\n2026-03-16T00:08:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe to /host/opt/cni/bin/\\\\n2026-03-16T00:08:33Z [verbose] multus-daemon started\\\\n2026-03-16T00:08:33Z [verbose] Readiness Indicator file check\\\\n2026-03-16T00:09:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.741891 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a
7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.769423 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:57Z\\\",\\\"message\\\":\\\"} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z]\\\\nI0316 00:08:57.626466 7110 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI0316 00:08:57.626473 7110 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.135672 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.135733 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.135750 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.135780 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.135797 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:20Z","lastTransitionTime":"2026-03-16T00:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.158421 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.164026 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.164081 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.164117 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.164146 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.164166 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:20Z","lastTransitionTime":"2026-03-16T00:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.186852 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.191717 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.191763 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.191779 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.191802 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.191819 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:20Z","lastTransitionTime":"2026-03-16T00:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.212081 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.216064 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.216143 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.216166 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.216191 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.216207 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:20Z","lastTransitionTime":"2026-03-16T00:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.235703 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.240468 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.240545 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.240615 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.240659 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.240678 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:20Z","lastTransitionTime":"2026-03-16T00:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.267532 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.267833 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.427757 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-szscw_e9789e58-12c8-4831-9401-af48a3e92209/kube-multus/0.log" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.427858 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-szscw" event={"ID":"e9789e58-12c8-4831-9401-af48a3e92209","Type":"ContainerStarted","Data":"b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26"} Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.452810 4816 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.470619 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.488756 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.505908 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"580d9565-d55c-4108-af2d-07ab6fa7ea2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a41f624c9d40a637cf958648b795587652d7139d4b7cea2e8c70d1cdf05dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.527830 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.548582 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.583416 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.595691 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.611623 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e
85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.625839 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.644696 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.659686 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.667536 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.667615 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.667654 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.667569 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.667763 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.667906 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.668065 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.668186 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.672770 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc 
kubenswrapper[4816]: I0316 00:09:20.687852 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.703972 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.717538 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a
7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.744393 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:57Z\\\",\\\"message\\\":\\\"} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z]\\\\nI0316 00:08:57.626466 7110 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI0316 00:08:57.626473 7110 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.769953 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.786148 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:18Z\\\",\\\"message\\\":\\\"2026-03-16T00:08:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe\\\\n2026-03-16T00:08:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe to /host/opt/cni/bin/\\\\n2026-03-16T00:08:33Z [verbose] multus-daemon started\\\\n2026-03-16T00:08:33Z [verbose] Readiness Indicator file check\\\\n2026-03-16T00:09:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:09:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:22 crc kubenswrapper[4816]: I0316 00:09:22.667255 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:22 crc kubenswrapper[4816]: E0316 00:09:22.667664 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:22 crc kubenswrapper[4816]: I0316 00:09:22.667431 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:22 crc kubenswrapper[4816]: E0316 00:09:22.667800 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:22 crc kubenswrapper[4816]: I0316 00:09:22.667340 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:22 crc kubenswrapper[4816]: E0316 00:09:22.667862 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:22 crc kubenswrapper[4816]: I0316 00:09:22.667462 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:22 crc kubenswrapper[4816]: E0316 00:09:22.667914 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:22 crc kubenswrapper[4816]: E0316 00:09:22.794914 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:24 crc kubenswrapper[4816]: I0316 00:09:24.666919 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:24 crc kubenswrapper[4816]: I0316 00:09:24.666974 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:24 crc kubenswrapper[4816]: I0316 00:09:24.667038 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:24 crc kubenswrapper[4816]: I0316 00:09:24.667049 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:24 crc kubenswrapper[4816]: E0316 00:09:24.667125 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:24 crc kubenswrapper[4816]: E0316 00:09:24.667219 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:24 crc kubenswrapper[4816]: E0316 00:09:24.667851 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:24 crc kubenswrapper[4816]: E0316 00:09:24.667994 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:24 crc kubenswrapper[4816]: I0316 00:09:24.669063 4816 scope.go:117] "RemoveContainer" containerID="cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.452608 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/2.log" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.455997 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c"} Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.456464 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.470464 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.484096 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.495330 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"580d9565-d55c-4108-af2d-07ab6fa7ea2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a41f624c9d40a637cf958648b795587652d7139d4b7cea2e8c70d1cdf05dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.507519 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.525009 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.545007 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.555615 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.568248 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e
85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.582670 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.599071 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.617592 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.635416 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc 
kubenswrapper[4816]: I0316 00:09:25.650115 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.662190 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.676617 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.697011 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:57Z\\\",\\\"message\\\":\\\"} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z]\\\\nI0316 00:08:57.626466 7110 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI0316 00:08:57.626473 7110 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, 
Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountP
ath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.715169 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.726155 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:18Z\\\",\\\"message\\\":\\\"2026-03-16T00:08:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe\\\\n2026-03-16T00:08:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe to /host/opt/cni/bin/\\\\n2026-03-16T00:08:33Z [verbose] multus-daemon started\\\\n2026-03-16T00:08:33Z [verbose] Readiness Indicator file check\\\\n2026-03-16T00:09:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:09:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.736928 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.463660 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/3.log" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.464922 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/2.log" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.468788 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c" exitCode=1 Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.468866 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c"} Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.468937 4816 scope.go:117] "RemoveContainer" containerID="cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.470230 4816 scope.go:117] "RemoveContainer" containerID="ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c" Mar 16 00:09:26 crc kubenswrapper[4816]: E0316 00:09:26.470586 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.506446 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41
a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.525660 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:18Z\\\",\\\"message\\\":\\\"2026-03-16T00:08:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe\\\\n2026-03-16T00:08:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe to /host/opt/cni/bin/\\\\n2026-03-16T00:08:33Z [verbose] multus-daemon started\\\\n2026-03-16T00:08:33Z [verbose] 
Readiness Indicator file check\\\\n2026-03-16T00:09:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:09:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.543127 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a
7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.566474 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:57Z\\\",\\\"message\\\":\\\"} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z]\\\\nI0316 00:08:57.626466 7110 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI0316 00:08:57.626473 7110 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:25Z\\\",\\\"message\\\":\\\":[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0316 00:09:25.816315 7422 services_controller.go:443] Built service openshift-etcd-operator/metrics LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.188\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0316 00:09:25.816331 7422 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5\\\\nI0316 00:09:25.816342 7422 services_controller.go:444] Built service openshift-etcd-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0316 00:09:25.815541 7422 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0316 00:09:25.815313 7422 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, 
clus\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658d
d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.581790 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"580d9565-d55c-4108-af2d-07ab6fa7ea2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a41f624c9d40a637cf958648b795587652d7139d4b7cea2e8c70d1cdf05dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.598115 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.615700 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.626946 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.643126 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.659610 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.667640 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:26 crc kubenswrapper[4816]: E0316 00:09:26.667788 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.667845 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.667978 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:26 crc kubenswrapper[4816]: E0316 00:09:26.668168 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.668514 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:26 crc kubenswrapper[4816]: E0316 00:09:26.668674 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:26 crc kubenswrapper[4816]: E0316 00:09:26.668839 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.674383 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.690368 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.717743 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.734232 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.751697 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.767123 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.786326 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.806985 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.825355 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.476025 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/3.log" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.481884 4816 scope.go:117] "RemoveContainer" containerID="ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c" Mar 16 00:09:27 crc kubenswrapper[4816]: E0316 00:09:27.482180 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.502153 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.521530 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.546442 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.563877 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.589446 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e
85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.610781 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.629389 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.649465 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.668045 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.691674 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 
3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.713723 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:18Z\\\",\\\"message\\\":\\\"2026-03-16T00:08:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe\\\\n2026-03-16T00:08:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe to /host/opt/cni/bin/\\\\n2026-03-16T00:08:33Z [verbose] multus-daemon started\\\\n2026-03-16T00:08:33Z [verbose] 
Readiness Indicator file check\\\\n2026-03-16T00:09:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:09:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.735592 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a
7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.768159 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:25Z\\\",\\\"message\\\":\\\":[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0316 00:09:25.816315 7422 services_controller.go:443] Built service openshift-etcd-operator/metrics LB cluster-wide 
configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.188\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0316 00:09:25.816331 7422 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5\\\\nI0316 00:09:25.816342 7422 services_controller.go:444] Built service openshift-etcd-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0316 00:09:25.815541 7422 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0316 00:09:25.815313 7422 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clus\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:09:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: E0316 00:09:27.795982 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.807849 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.830785 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.850368 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.865219 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.880527 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.896471 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"580d9565-d55c-4108-af2d-07ab6fa7ea2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a41f624c9d40a637cf958648b795587652d7139d4b7cea2e8c70d1cdf05dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.917505 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.938637 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.957390 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.976835 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.994295 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"580d9565-d55c-4108-af2d-07ab6fa7ea2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a41f624c9d40a637cf958648b795587652d7139d4b7cea2e8c70d1cdf05dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.014987 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.033641 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.055228 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.069758 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.087888 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e
85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.102652 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.115263 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.126334 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.136790 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.147787 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 
3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.164158 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:18Z\\\",\\\"message\\\":\\\"2026-03-16T00:08:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe\\\\n2026-03-16T00:08:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe to /host/opt/cni/bin/\\\\n2026-03-16T00:08:33Z [verbose] multus-daemon started\\\\n2026-03-16T00:08:33Z [verbose] 
Readiness Indicator file check\\\\n2026-03-16T00:09:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:09:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.176422 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a
7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.206058 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:25Z\\\",\\\"message\\\":\\\":[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0316 00:09:25.816315 7422 services_controller.go:443] Built service openshift-etcd-operator/metrics LB cluster-wide 
configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.188\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0316 00:09:25.816331 7422 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5\\\\nI0316 00:09:25.816342 7422 services_controller.go:444] Built service openshift-etcd-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0316 00:09:25.815541 7422 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0316 00:09:25.815313 7422 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clus\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:09:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.225108 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.667228 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.667252 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.667321 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.667402 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:28 crc kubenswrapper[4816]: E0316 00:09:28.667951 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:28 crc kubenswrapper[4816]: E0316 00:09:28.668271 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:28 crc kubenswrapper[4816]: E0316 00:09:28.668433 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:28 crc kubenswrapper[4816]: E0316 00:09:28.668537 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.626334 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.626393 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.626418 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.626446 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.626467 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:30Z","lastTransitionTime":"2026-03-16T00:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.648757 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:30Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.655530 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.655818 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.655994 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.656158 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.656302 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:30Z","lastTransitionTime":"2026-03-16T00:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.667299 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.667416 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.667374 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.667528 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.667321 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.667898 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.668293 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.668623 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.676786 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:30Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.681948 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.682004 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.682031 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.682060 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.682081 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:30Z","lastTransitionTime":"2026-03-16T00:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.702992 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:30Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.708664 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.708719 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.708735 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.708763 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.708780 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:30Z","lastTransitionTime":"2026-03-16T00:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.729509 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:30Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.735198 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.735261 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.735285 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.735309 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.735326 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:30Z","lastTransitionTime":"2026-03-16T00:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.754140 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:30Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.754372 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:09:32 crc kubenswrapper[4816]: I0316 00:09:32.667653 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:32 crc kubenswrapper[4816]: I0316 00:09:32.667692 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:32 crc kubenswrapper[4816]: I0316 00:09:32.667760 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:32 crc kubenswrapper[4816]: I0316 00:09:32.667891 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:32 crc kubenswrapper[4816]: E0316 00:09:32.667887 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:32 crc kubenswrapper[4816]: E0316 00:09:32.667987 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:32 crc kubenswrapper[4816]: E0316 00:09:32.668202 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:32 crc kubenswrapper[4816]: E0316 00:09:32.668318 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:32 crc kubenswrapper[4816]: E0316 00:09:32.797288 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:34 crc kubenswrapper[4816]: I0316 00:09:34.667285 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:34 crc kubenswrapper[4816]: E0316 00:09:34.667692 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:34 crc kubenswrapper[4816]: I0316 00:09:34.667429 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:34 crc kubenswrapper[4816]: I0316 00:09:34.667382 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:34 crc kubenswrapper[4816]: E0316 00:09:34.667775 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:34 crc kubenswrapper[4816]: I0316 00:09:34.667432 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:34 crc kubenswrapper[4816]: E0316 00:09:34.667996 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:34 crc kubenswrapper[4816]: E0316 00:09:34.668173 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:36 crc kubenswrapper[4816]: I0316 00:09:36.667202 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:36 crc kubenswrapper[4816]: I0316 00:09:36.667290 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:36 crc kubenswrapper[4816]: E0316 00:09:36.667372 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:36 crc kubenswrapper[4816]: E0316 00:09:36.667498 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:36 crc kubenswrapper[4816]: I0316 00:09:36.667616 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:36 crc kubenswrapper[4816]: E0316 00:09:36.667703 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:36 crc kubenswrapper[4816]: I0316 00:09:36.667757 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:36 crc kubenswrapper[4816]: E0316 00:09:36.667837 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.690734 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/opensh
ift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.712804 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.728847 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.746883 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.758282 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.775269 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.793112 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: E0316 00:09:37.798366 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.811430 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.827123 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.840841 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc 
kubenswrapper[4816]: I0316 00:09:37.859307 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.879842 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b45d6058e72f7117fdfb86b
3480de77c8592dba7dcd7932b2a5c34036da7af26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:18Z\\\",\\\"message\\\":\\\"2026-03-16T00:08:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe\\\\n2026-03-16T00:08:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe to /host/opt/cni/bin/\\\\n2026-03-16T00:08:33Z [verbose] multus-daemon started\\\\n2026-03-16T00:08:33Z [verbose] Readiness Indicator file check\\\\n2026-03-16T00:09:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:09:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.898037 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.934099 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:25Z\\\",\\\"message\\\":\\\":[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0316 00:09:25.816315 7422 services_controller.go:443] Built service openshift-etcd-operator/metrics LB cluster-wide 
configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.188\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0316 00:09:25.816331 7422 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5\\\\nI0316 00:09:25.816342 7422 services_controller.go:444] Built service openshift-etcd-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0316 00:09:25.815541 7422 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0316 00:09:25.815313 7422 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clus\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:09:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.950234 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"580d9565-d55c-4108-af2d-07ab6fa7ea2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a41f624c9d40a637cf958648b795587652d7139d4b7cea2e8c70d1cdf05dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.971616 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.995724 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:38 crc kubenswrapper[4816]: I0316 00:09:38.011905 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:38 crc kubenswrapper[4816]: I0316 00:09:38.027754 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:38 crc kubenswrapper[4816]: I0316 00:09:38.667717 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:38 crc kubenswrapper[4816]: I0316 00:09:38.667785 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:38 crc kubenswrapper[4816]: I0316 00:09:38.667809 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:38 crc kubenswrapper[4816]: I0316 00:09:38.667717 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:38 crc kubenswrapper[4816]: E0316 00:09:38.668006 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:38 crc kubenswrapper[4816]: E0316 00:09:38.668144 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:38 crc kubenswrapper[4816]: E0316 00:09:38.668290 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:38 crc kubenswrapper[4816]: E0316 00:09:38.668405 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:39 crc kubenswrapper[4816]: I0316 00:09:39.668355 4816 scope.go:117] "RemoveContainer" containerID="ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c" Mar 16 00:09:39 crc kubenswrapper[4816]: E0316 00:09:39.668542 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" Mar 16 00:09:40 crc kubenswrapper[4816]: I0316 00:09:40.667056 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:40 crc kubenswrapper[4816]: I0316 00:09:40.667181 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:40 crc kubenswrapper[4816]: E0316 00:09:40.667292 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:40 crc kubenswrapper[4816]: I0316 00:09:40.667346 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:40 crc kubenswrapper[4816]: E0316 00:09:40.667433 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:40 crc kubenswrapper[4816]: I0316 00:09:40.667785 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:40 crc kubenswrapper[4816]: E0316 00:09:40.667738 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:40 crc kubenswrapper[4816]: E0316 00:09:40.667979 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.129203 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.129292 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.129311 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.129385 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.129405 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:41Z","lastTransitionTime":"2026-03-16T00:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:41 crc kubenswrapper[4816]: E0316 00:09:41.151107 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.156970 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.157029 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.157048 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.157122 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.157143 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:41Z","lastTransitionTime":"2026-03-16T00:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:41 crc kubenswrapper[4816]: E0316 00:09:41.175799 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.180624 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.180720 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.180758 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.180791 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.180815 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:41Z","lastTransitionTime":"2026-03-16T00:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:41 crc kubenswrapper[4816]: E0316 00:09:41.199668 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.205017 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.205077 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.205096 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.205123 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.205143 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:41Z","lastTransitionTime":"2026-03-16T00:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:41 crc kubenswrapper[4816]: E0316 00:09:41.223915 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.228379 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.228421 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.228433 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.228458 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.228472 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:41Z","lastTransitionTime":"2026-03-16T00:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:41 crc kubenswrapper[4816]: E0316 00:09:41.245928 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:41 crc kubenswrapper[4816]: E0316 00:09:41.246175 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:09:42 crc kubenswrapper[4816]: I0316 00:09:42.666717 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:42 crc kubenswrapper[4816]: I0316 00:09:42.666782 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:42 crc kubenswrapper[4816]: I0316 00:09:42.666731 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:42 crc kubenswrapper[4816]: I0316 00:09:42.666726 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:42 crc kubenswrapper[4816]: E0316 00:09:42.667230 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:42 crc kubenswrapper[4816]: E0316 00:09:42.667049 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:42 crc kubenswrapper[4816]: E0316 00:09:42.667425 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:42 crc kubenswrapper[4816]: E0316 00:09:42.667928 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:42 crc kubenswrapper[4816]: E0316 00:09:42.800623 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:44 crc kubenswrapper[4816]: I0316 00:09:44.667675 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:44 crc kubenswrapper[4816]: I0316 00:09:44.667769 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:44 crc kubenswrapper[4816]: I0316 00:09:44.667810 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:44 crc kubenswrapper[4816]: E0316 00:09:44.668065 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:44 crc kubenswrapper[4816]: I0316 00:09:44.668136 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:44 crc kubenswrapper[4816]: E0316 00:09:44.668282 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:44 crc kubenswrapper[4816]: E0316 00:09:44.668400 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:44 crc kubenswrapper[4816]: E0316 00:09:44.668908 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:46 crc kubenswrapper[4816]: I0316 00:09:46.667005 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:46 crc kubenswrapper[4816]: I0316 00:09:46.667080 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:46 crc kubenswrapper[4816]: E0316 00:09:46.667236 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:46 crc kubenswrapper[4816]: I0316 00:09:46.667294 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:46 crc kubenswrapper[4816]: I0316 00:09:46.667342 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:46 crc kubenswrapper[4816]: E0316 00:09:46.667536 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:46 crc kubenswrapper[4816]: E0316 00:09:46.667679 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:46 crc kubenswrapper[4816]: E0316 00:09:46.667763 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.702111 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49
117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\
\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.725893 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:18Z\\\",\\\"message\\\":\\\"2026-03-16T00:08:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe\\\\n2026-03-16T00:08:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe to /host/opt/cni/bin/\\\\n2026-03-16T00:08:33Z [verbose] multus-daemon started\\\\n2026-03-16T00:08:33Z [verbose] 
Readiness Indicator file check\\\\n2026-03-16T00:09:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:09:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.744387 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a
7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.784076 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:25Z\\\",\\\"message\\\":\\\":[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0316 00:09:25.816315 7422 services_controller.go:443] Built service openshift-etcd-operator/metrics LB cluster-wide 
configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.188\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0316 00:09:25.816331 7422 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5\\\\nI0316 00:09:25.816342 7422 services_controller.go:444] Built service openshift-etcd-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0316 00:09:25.815541 7422 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0316 00:09:25.815313 7422 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clus\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:09:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: E0316 00:09:47.801620 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.805434 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"580d9565-d55c-4108-af2d-07ab6fa7ea2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a41f624c9d40a637cf958648b795587652d7139d4b7cea2e8c70d1cdf05dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.833963 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.856602 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.877336 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.897353 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.923267 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.940282 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.958802 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.980129 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.997712 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:48 crc kubenswrapper[4816]: I0316 00:09:48.019205 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:48 crc kubenswrapper[4816]: I0316 00:09:48.040047 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:48 crc kubenswrapper[4816]: I0316 00:09:48.058655 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:48 crc kubenswrapper[4816]: I0316 00:09:48.075992 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:48 crc kubenswrapper[4816]: I0316 00:09:48.090696 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:48 crc kubenswrapper[4816]: I0316 00:09:48.667418 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:48 crc kubenswrapper[4816]: I0316 00:09:48.667428 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:48 crc kubenswrapper[4816]: E0316 00:09:48.667628 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:48 crc kubenswrapper[4816]: E0316 00:09:48.667740 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:48 crc kubenswrapper[4816]: I0316 00:09:48.667344 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:48 crc kubenswrapper[4816]: E0316 00:09:48.668146 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:48 crc kubenswrapper[4816]: I0316 00:09:48.668368 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:48 crc kubenswrapper[4816]: E0316 00:09:48.668664 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:49 crc kubenswrapper[4816]: I0316 00:09:49.225449 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:49 crc kubenswrapper[4816]: E0316 00:09:49.225637 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:09:49 crc kubenswrapper[4816]: E0316 00:09:49.225689 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs podName:84360ef9-0450-44c5-80eb-eab1bf8e808b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:53.225674012 +0000 UTC m=+246.321973975 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs") pod "network-metrics-daemon-jqsjn" (UID: "84360ef9-0450-44c5-80eb-eab1bf8e808b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:09:50 crc kubenswrapper[4816]: I0316 00:09:50.666890 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:50 crc kubenswrapper[4816]: I0316 00:09:50.666886 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:50 crc kubenswrapper[4816]: I0316 00:09:50.666895 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:50 crc kubenswrapper[4816]: I0316 00:09:50.666776 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:50 crc kubenswrapper[4816]: E0316 00:09:50.667491 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:50 crc kubenswrapper[4816]: E0316 00:09:50.667714 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:50 crc kubenswrapper[4816]: E0316 00:09:50.667844 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:50 crc kubenswrapper[4816]: E0316 00:09:50.667911 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:50 crc kubenswrapper[4816]: I0316 00:09:50.669056 4816 scope.go:117] "RemoveContainer" containerID="ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c" Mar 16 00:09:50 crc kubenswrapper[4816]: E0316 00:09:50.669324 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.259520 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.259605 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.259630 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.259654 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.259670 4816 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:51Z","lastTransitionTime":"2026-03-16T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:51 crc kubenswrapper[4816]: E0316 00:09:51.281183 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.286906 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.286979 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.286998 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.287025 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.287044 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:51Z","lastTransitionTime":"2026-03-16T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:51 crc kubenswrapper[4816]: E0316 00:09:51.306503 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.311272 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.311328 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.311357 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.311389 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.311412 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:51Z","lastTransitionTime":"2026-03-16T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:51 crc kubenswrapper[4816]: E0316 00:09:51.330307 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.335940 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.335990 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.336007 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.336030 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.336046 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:51Z","lastTransitionTime":"2026-03-16T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:51 crc kubenswrapper[4816]: E0316 00:09:51.354623 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.360657 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.360770 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.360793 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.360827 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.360849 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:51Z","lastTransitionTime":"2026-03-16T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:51 crc kubenswrapper[4816]: E0316 00:09:51.383287 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:51 crc kubenswrapper[4816]: E0316 00:09:51.383533 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:09:52 crc kubenswrapper[4816]: I0316 00:09:52.666892 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:52 crc kubenswrapper[4816]: I0316 00:09:52.666951 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:52 crc kubenswrapper[4816]: I0316 00:09:52.667024 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:52 crc kubenswrapper[4816]: E0316 00:09:52.667138 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:52 crc kubenswrapper[4816]: I0316 00:09:52.667251 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:52 crc kubenswrapper[4816]: E0316 00:09:52.667375 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:52 crc kubenswrapper[4816]: E0316 00:09:52.667477 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:52 crc kubenswrapper[4816]: E0316 00:09:52.667657 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:52 crc kubenswrapper[4816]: E0316 00:09:52.803048 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:54 crc kubenswrapper[4816]: I0316 00:09:54.667167 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:54 crc kubenswrapper[4816]: I0316 00:09:54.667205 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:54 crc kubenswrapper[4816]: I0316 00:09:54.667244 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:54 crc kubenswrapper[4816]: E0316 00:09:54.667367 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:54 crc kubenswrapper[4816]: I0316 00:09:54.667398 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:54 crc kubenswrapper[4816]: E0316 00:09:54.667764 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:54 crc kubenswrapper[4816]: E0316 00:09:54.667543 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:54 crc kubenswrapper[4816]: E0316 00:09:54.667869 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:56 crc kubenswrapper[4816]: I0316 00:09:56.666684 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:56 crc kubenswrapper[4816]: I0316 00:09:56.666734 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:56 crc kubenswrapper[4816]: I0316 00:09:56.666754 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:56 crc kubenswrapper[4816]: I0316 00:09:56.666691 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:56 crc kubenswrapper[4816]: E0316 00:09:56.666911 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:56 crc kubenswrapper[4816]: E0316 00:09:56.667128 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:56 crc kubenswrapper[4816]: E0316 00:09:56.667495 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:56 crc kubenswrapper[4816]: E0316 00:09:56.667751 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:09:57 crc kubenswrapper[4816]: I0316 00:09:57.724597 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=57.724516899 podStartE2EDuration="57.724516899s" podCreationTimestamp="2026-03-16 00:09:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:57.703988684 +0000 UTC m=+190.800288667" watchObservedRunningTime="2026-03-16 00:09:57.724516899 +0000 UTC m=+190.820816892"
Mar 16 00:09:57 crc kubenswrapper[4816]: I0316 00:09:57.724957 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=72.724945531 podStartE2EDuration="1m12.724945531s" podCreationTimestamp="2026-03-16 00:08:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:57.724928961 +0000 UTC m=+190.821228954" watchObservedRunningTime="2026-03-16 00:09:57.724945531 +0000 UTC m=+190.821245514"
Mar 16 00:09:57 crc kubenswrapper[4816]: E0316 00:09:57.803950 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 16 00:09:57 crc kubenswrapper[4816]: I0316 00:09:57.809339 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=96.809309155 podStartE2EDuration="1m36.809309155s" podCreationTimestamp="2026-03-16 00:08:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:57.807634906 +0000 UTC m=+190.903934939" watchObservedRunningTime="2026-03-16 00:09:57.809309155 +0000 UTC m=+190.905609188"
Mar 16 00:09:57 crc kubenswrapper[4816]: I0316 00:09:57.853688 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-szscw" podStartSLOduration=140.853661701 podStartE2EDuration="2m20.853661701s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:57.832423276 +0000 UTC m=+190.928723279" watchObservedRunningTime="2026-03-16 00:09:57.853661701 +0000 UTC m=+190.949961694"
Mar 16 00:09:57 crc kubenswrapper[4816]: I0316 00:09:57.895310 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podStartSLOduration=140.895280856 podStartE2EDuration="2m20.895280856s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:57.853133555 +0000 UTC m=+190.949433578" watchObservedRunningTime="2026-03-16 00:09:57.895280856 +0000 UTC m=+190.991580849"
Mar 16 00:09:57 crc kubenswrapper[4816]: I0316 00:09:57.912338 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=43.912312448 podStartE2EDuration="43.912312448s" podCreationTimestamp="2026-03-16 00:09:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:57.910946597 +0000 UTC m=+191.007246590" watchObservedRunningTime="2026-03-16 00:09:57.912312448 +0000 UTC m=+191.008612441"
Mar 16 00:09:57 crc kubenswrapper[4816]: I0316 00:09:57.966899 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cnhkf" podStartSLOduration=140.966864814 podStartE2EDuration="2m20.966864814s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:57.966842233 +0000 UTC m=+191.063142226" watchObservedRunningTime="2026-03-16 00:09:57.966864814 +0000 UTC m=+191.063164807"
Mar 16 00:09:57 crc kubenswrapper[4816]: I0316 00:09:57.992077 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" podStartSLOduration=139.992054035 podStartE2EDuration="2m19.992054035s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:57.991041686 +0000 UTC m=+191.087341669" watchObservedRunningTime="2026-03-16 00:09:57.992054035 +0000 UTC m=+191.088354008"
Mar 16 00:09:58 crc kubenswrapper[4816]: I0316 00:09:58.013596 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=98.013541158 podStartE2EDuration="1m38.013541158s" podCreationTimestamp="2026-03-16 00:08:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:58.013148596 +0000 UTC m=+191.109448549" watchObservedRunningTime="2026-03-16 00:09:58.013541158 +0000 UTC m=+191.109841151"
Mar 16 00:09:58 crc kubenswrapper[4816]: I0316 00:09:58.085963 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" podStartSLOduration=141.08594475 podStartE2EDuration="2m21.08594475s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:58.085198718 +0000 UTC m=+191.181498681" watchObservedRunningTime="2026-03-16 00:09:58.08594475 +0000 UTC m=+191.182244703"
Mar 16 00:09:58 crc kubenswrapper[4816]: I0316 00:09:58.667139 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:09:58 crc kubenswrapper[4816]: E0316 00:09:58.667320 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:09:58 crc kubenswrapper[4816]: I0316 00:09:58.667704 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:09:58 crc kubenswrapper[4816]: I0316 00:09:58.667857 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn"
Mar 16 00:09:58 crc kubenswrapper[4816]: E0316 00:09:58.668006 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b"
Mar 16 00:09:58 crc kubenswrapper[4816]: I0316 00:09:58.668250 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:09:58 crc kubenswrapper[4816]: E0316 00:09:58.668256 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:09:58 crc kubenswrapper[4816]: E0316 00:09:58.668829 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:10:00 crc kubenswrapper[4816]: I0316 00:10:00.667204 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn"
Mar 16 00:10:00 crc kubenswrapper[4816]: I0316 00:10:00.667814 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:10:00 crc kubenswrapper[4816]: I0316 00:10:00.667836 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:10:00 crc kubenswrapper[4816]: E0316 00:10:00.667945 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b"
Mar 16 00:10:00 crc kubenswrapper[4816]: I0316 00:10:00.668252 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:10:00 crc kubenswrapper[4816]: E0316 00:10:00.668346 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:10:00 crc kubenswrapper[4816]: E0316 00:10:00.668599 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:10:00 crc kubenswrapper[4816]: E0316 00:10:00.668796 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.483891 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.483960 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.483984 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.484013 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.484030 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:10:01Z","lastTransitionTime":"2026-03-16T00:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.550921 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lhpbn" podStartSLOduration=144.550890242 podStartE2EDuration="2m24.550890242s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:58.105763483 +0000 UTC m=+191.202063436" watchObservedRunningTime="2026-03-16 00:10:01.550890242 +0000 UTC m=+194.647190235"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.552825 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr"]
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.553319 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.558458 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.558458 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.558887 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.559032 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.668722 4816 scope.go:117] "RemoveContainer" containerID="ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c"
Mar 16 00:10:01 crc kubenswrapper[4816]: E0316 00:10:01.669049 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.671788 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c33fcbae-202d-40ad-a561-e15eddf3cb4c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.672158 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c33fcbae-202d-40ad-a561-e15eddf3cb4c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.672379 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c33fcbae-202d-40ad-a561-e15eddf3cb4c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.672688 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c33fcbae-202d-40ad-a561-e15eddf3cb4c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.672996 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c33fcbae-202d-40ad-a561-e15eddf3cb4c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.712243 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.721936 4816 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.774618 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c33fcbae-202d-40ad-a561-e15eddf3cb4c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.774760 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c33fcbae-202d-40ad-a561-e15eddf3cb4c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.774812 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c33fcbae-202d-40ad-a561-e15eddf3cb4c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.774843 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c33fcbae-202d-40ad-a561-e15eddf3cb4c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.774872 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c33fcbae-202d-40ad-a561-e15eddf3cb4c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.774928 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c33fcbae-202d-40ad-a561-e15eddf3cb4c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.775137 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c33fcbae-202d-40ad-a561-e15eddf3cb4c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.776127 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c33fcbae-202d-40ad-a561-e15eddf3cb4c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.784846 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c33fcbae-202d-40ad-a561-e15eddf3cb4c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.796254 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c33fcbae-202d-40ad-a561-e15eddf3cb4c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr"
Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.876370 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr"
Mar 16 00:10:02 crc kubenswrapper[4816]: I0316 00:10:02.614100 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" event={"ID":"c33fcbae-202d-40ad-a561-e15eddf3cb4c","Type":"ContainerStarted","Data":"f1a3b2e86078770d109d211ccf2fee60918ab6e50d5b768bd6a9ff73c900fcdd"}
Mar 16 00:10:02 crc kubenswrapper[4816]: I0316 00:10:02.614465 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" event={"ID":"c33fcbae-202d-40ad-a561-e15eddf3cb4c","Type":"ContainerStarted","Data":"be6102f089f6cd1ae95738a8b57c8ba13e4c5558709a37a45d02234f149ca5ce"}
Mar 16 00:10:02 crc kubenswrapper[4816]: I0316 00:10:02.633868 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" podStartSLOduration=145.633846616 podStartE2EDuration="2m25.633846616s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:02.632718093 +0000 UTC m=+195.729018086" watchObservedRunningTime="2026-03-16 00:10:02.633846616 +0000 UTC m=+195.730146609"
Mar 16 00:10:02 crc kubenswrapper[4816]: I0316 00:10:02.666824 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:10:02 crc kubenswrapper[4816]: I0316 00:10:02.667330 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn"
Mar 16 00:10:02 crc kubenswrapper[4816]: I0316 00:10:02.667242 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:10:02 crc kubenswrapper[4816]: I0316 00:10:02.667207 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:10:02 crc kubenswrapper[4816]: E0316 00:10:02.667854 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:10:02 crc kubenswrapper[4816]: E0316 00:10:02.668229 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:10:02 crc kubenswrapper[4816]: E0316 00:10:02.668316 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b"
Mar 16 00:10:02 crc kubenswrapper[4816]: E0316 00:10:02.668409 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:10:02 crc kubenswrapper[4816]: E0316 00:10:02.805608 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 16 00:10:04 crc kubenswrapper[4816]: I0316 00:10:04.667328 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:10:04 crc kubenswrapper[4816]: I0316 00:10:04.667384 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:10:04 crc kubenswrapper[4816]: E0316 00:10:04.667452 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:10:04 crc kubenswrapper[4816]: I0316 00:10:04.667339 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn"
Mar 16 00:10:04 crc kubenswrapper[4816]: E0316 00:10:04.667667 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:10:04 crc kubenswrapper[4816]: E0316 00:10:04.667719 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b"
Mar 16 00:10:04 crc kubenswrapper[4816]: I0316 00:10:04.668338 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:10:04 crc kubenswrapper[4816]: E0316 00:10:04.668639 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:10:05 crc kubenswrapper[4816]: I0316 00:10:05.627222 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-szscw_e9789e58-12c8-4831-9401-af48a3e92209/kube-multus/1.log"
Mar 16 00:10:05 crc kubenswrapper[4816]: I0316 00:10:05.627972 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-szscw_e9789e58-12c8-4831-9401-af48a3e92209/kube-multus/0.log"
Mar 16 00:10:05 crc kubenswrapper[4816]: I0316 00:10:05.628054 4816 generic.go:334] "Generic (PLEG): container finished" podID="e9789e58-12c8-4831-9401-af48a3e92209" containerID="b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26" exitCode=1
Mar 16 00:10:05 crc kubenswrapper[4816]: I0316 00:10:05.628102 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-szscw" event={"ID":"e9789e58-12c8-4831-9401-af48a3e92209","Type":"ContainerDied","Data":"b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26"}
Mar 16 00:10:05 crc kubenswrapper[4816]: I0316 00:10:05.628149 4816 scope.go:117] "RemoveContainer" containerID="e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962"
Mar 16 00:10:05 crc kubenswrapper[4816]: I0316 00:10:05.628928 4816 scope.go:117] "RemoveContainer" containerID="b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26"
Mar 16 00:10:05 crc kubenswrapper[4816]: E0316 00:10:05.629294 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-szscw_openshift-multus(e9789e58-12c8-4831-9401-af48a3e92209)\"" pod="openshift-multus/multus-szscw" podUID="e9789e58-12c8-4831-9401-af48a3e92209"
Mar 16 00:10:06 crc kubenswrapper[4816]: I0316 00:10:06.634314 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-szscw_e9789e58-12c8-4831-9401-af48a3e92209/kube-multus/1.log"
Mar 16 00:10:06 crc kubenswrapper[4816]: I0316 00:10:06.667148 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:10:06 crc kubenswrapper[4816]: I0316 00:10:06.667284 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn"
Mar 16 00:10:06 crc kubenswrapper[4816]: I0316 00:10:06.667327 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:10:06 crc kubenswrapper[4816]: I0316 00:10:06.667389 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:10:06 crc kubenswrapper[4816]: E0316 00:10:06.667386 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:10:06 crc kubenswrapper[4816]: E0316 00:10:06.667587 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:10:06 crc kubenswrapper[4816]: E0316 00:10:06.667637 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:10:06 crc kubenswrapper[4816]: E0316 00:10:06.668038 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b"
Mar 16 00:10:07 crc kubenswrapper[4816]: E0316 00:10:07.806799 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 16 00:10:08 crc kubenswrapper[4816]: I0316 00:10:08.667545 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn"
Mar 16 00:10:08 crc kubenswrapper[4816]: I0316 00:10:08.667617 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:10:08 crc kubenswrapper[4816]: I0316 00:10:08.667682 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:10:08 crc kubenswrapper[4816]: E0316 00:10:08.667809 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b"
Mar 16 00:10:08 crc kubenswrapper[4816]: I0316 00:10:08.667832 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:10:08 crc kubenswrapper[4816]: E0316 00:10:08.667980 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:10:08 crc kubenswrapper[4816]: E0316 00:10:08.668069 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:10:08 crc kubenswrapper[4816]: E0316 00:10:08.668160 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:10:10 crc kubenswrapper[4816]: I0316 00:10:10.667432 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:10:10 crc kubenswrapper[4816]: I0316 00:10:10.667431 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:10:10 crc kubenswrapper[4816]: E0316 00:10:10.667668 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:10:10 crc kubenswrapper[4816]: I0316 00:10:10.667441 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn"
Mar 16 00:10:10 crc kubenswrapper[4816]: E0316 00:10:10.667871 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b"
Mar 16 00:10:10 crc kubenswrapper[4816]: E0316 00:10:10.667724 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:10:10 crc kubenswrapper[4816]: I0316 00:10:10.668416 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:10:10 crc kubenswrapper[4816]: E0316 00:10:10.668627 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:10:12 crc kubenswrapper[4816]: I0316 00:10:12.666696 4816 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:12 crc kubenswrapper[4816]: I0316 00:10:12.666756 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:12 crc kubenswrapper[4816]: E0316 00:10:12.666860 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:10:12 crc kubenswrapper[4816]: I0316 00:10:12.666714 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:12 crc kubenswrapper[4816]: I0316 00:10:12.670711 4816 scope.go:117] "RemoveContainer" containerID="ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c" Mar 16 00:10:12 crc kubenswrapper[4816]: I0316 00:10:12.671675 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:12 crc kubenswrapper[4816]: E0316 00:10:12.672138 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:10:12 crc kubenswrapper[4816]: E0316 00:10:12.672946 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:10:12 crc kubenswrapper[4816]: E0316 00:10:12.673271 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:10:12 crc kubenswrapper[4816]: E0316 00:10:12.807954 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:10:13 crc kubenswrapper[4816]: I0316 00:10:13.619834 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jqsjn"] Mar 16 00:10:13 crc kubenswrapper[4816]: I0316 00:10:13.663805 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/3.log" Mar 16 00:10:13 crc kubenswrapper[4816]: I0316 00:10:13.666792 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:13 crc kubenswrapper[4816]: E0316 00:10:13.666965 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:10:13 crc kubenswrapper[4816]: I0316 00:10:13.672015 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"7e5d87dc1889484bb2175c0613eda0b852c65a289a1c165f6adae2a822892aa2"} Mar 16 00:10:13 crc kubenswrapper[4816]: I0316 00:10:13.672622 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:10:13 crc kubenswrapper[4816]: I0316 00:10:13.716813 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podStartSLOduration=156.716792595 podStartE2EDuration="2m36.716792595s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:13.716124667 +0000 UTC m=+206.812424630" watchObservedRunningTime="2026-03-16 00:10:13.716792595 +0000 UTC m=+206.813092548" Mar 16 00:10:14 crc kubenswrapper[4816]: I0316 00:10:14.667310 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:14 crc kubenswrapper[4816]: I0316 00:10:14.667345 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:14 crc kubenswrapper[4816]: I0316 00:10:14.667375 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:14 crc kubenswrapper[4816]: E0316 00:10:14.667514 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:10:14 crc kubenswrapper[4816]: E0316 00:10:14.667753 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:10:14 crc kubenswrapper[4816]: E0316 00:10:14.667921 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:10:15 crc kubenswrapper[4816]: I0316 00:10:15.667322 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:15 crc kubenswrapper[4816]: E0316 00:10:15.667536 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:10:16 crc kubenswrapper[4816]: I0316 00:10:16.591617 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.591944 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:12:18.591894213 +0000 UTC m=+331.688194166 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:16 crc kubenswrapper[4816]: I0316 00:10:16.667659 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:16 crc kubenswrapper[4816]: I0316 00:10:16.667779 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:16 crc kubenswrapper[4816]: I0316 00:10:16.667666 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.667899 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.668016 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.668249 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:10:16 crc kubenswrapper[4816]: I0316 00:10:16.692895 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:16 crc kubenswrapper[4816]: I0316 00:10:16.692982 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:16 crc kubenswrapper[4816]: I0316 00:10:16.693033 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:16 crc kubenswrapper[4816]: I0316 00:10:16.693070 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693112 4816 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693143 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693211 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:12:18.693188919 +0000 UTC m=+331.789488932 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693236 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:12:18.69322644 +0000 UTC m=+331.789526493 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693249 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693279 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693298 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693348 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693407 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693425 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693373 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:12:18.693351893 +0000 UTC m=+331.789651886 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693534 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:12:18.693508898 +0000 UTC m=+331.789808851 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:10:17 crc kubenswrapper[4816]: I0316 00:10:17.667257 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:17 crc kubenswrapper[4816]: E0316 00:10:17.669930 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:10:17 crc kubenswrapper[4816]: E0316 00:10:17.808844 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:10:18 crc kubenswrapper[4816]: I0316 00:10:18.667034 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:18 crc kubenswrapper[4816]: I0316 00:10:18.667211 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:18 crc kubenswrapper[4816]: I0316 00:10:18.667348 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:18 crc kubenswrapper[4816]: E0316 00:10:18.667340 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:10:18 crc kubenswrapper[4816]: E0316 00:10:18.667524 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:10:18 crc kubenswrapper[4816]: E0316 00:10:18.667874 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:10:18 crc kubenswrapper[4816]: I0316 00:10:18.668179 4816 scope.go:117] "RemoveContainer" containerID="b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26" Mar 16 00:10:19 crc kubenswrapper[4816]: I0316 00:10:19.666694 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:19 crc kubenswrapper[4816]: E0316 00:10:19.667238 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:10:19 crc kubenswrapper[4816]: I0316 00:10:19.686448 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-szscw_e9789e58-12c8-4831-9401-af48a3e92209/kube-multus/1.log" Mar 16 00:10:19 crc kubenswrapper[4816]: I0316 00:10:19.686531 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-szscw" event={"ID":"e9789e58-12c8-4831-9401-af48a3e92209","Type":"ContainerStarted","Data":"707ec2df051aa6206ac2bc1c4db6b5fe6b37467b90b6ee42dbf28f2b88e5d6e6"} Mar 16 00:10:20 crc kubenswrapper[4816]: I0316 00:10:20.667287 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:20 crc kubenswrapper[4816]: I0316 00:10:20.667372 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:20 crc kubenswrapper[4816]: I0316 00:10:20.667323 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:20 crc kubenswrapper[4816]: E0316 00:10:20.667518 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:10:20 crc kubenswrapper[4816]: E0316 00:10:20.667741 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:10:20 crc kubenswrapper[4816]: E0316 00:10:20.667875 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:10:21 crc kubenswrapper[4816]: I0316 00:10:21.666885 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:21 crc kubenswrapper[4816]: E0316 00:10:21.667507 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:10:22 crc kubenswrapper[4816]: I0316 00:10:22.667010 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:22 crc kubenswrapper[4816]: E0316 00:10:22.667228 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:10:22 crc kubenswrapper[4816]: I0316 00:10:22.667051 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:22 crc kubenswrapper[4816]: E0316 00:10:22.667419 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:10:22 crc kubenswrapper[4816]: I0316 00:10:22.667011 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:22 crc kubenswrapper[4816]: E0316 00:10:22.667516 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:10:23 crc kubenswrapper[4816]: I0316 00:10:23.666907 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:23 crc kubenswrapper[4816]: I0316 00:10:23.668993 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 16 00:10:23 crc kubenswrapper[4816]: I0316 00:10:23.669627 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 16 00:10:24 crc kubenswrapper[4816]: I0316 00:10:24.666918 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:24 crc kubenswrapper[4816]: I0316 00:10:24.666917 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:24 crc kubenswrapper[4816]: I0316 00:10:24.667110 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:24 crc kubenswrapper[4816]: I0316 00:10:24.669865 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 16 00:10:24 crc kubenswrapper[4816]: I0316 00:10:24.669912 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 16 00:10:24 crc kubenswrapper[4816]: I0316 00:10:24.669918 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 16 00:10:24 crc kubenswrapper[4816]: I0316 00:10:24.670226 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 16 00:10:31 crc kubenswrapper[4816]: I0316 00:10:31.944937 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 16 00:10:31 crc kubenswrapper[4816]: I0316 00:10:31.993869 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-nnqsw"] Mar 16 00:10:31 crc kubenswrapper[4816]: I0316 00:10:31.994439 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.000177 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6r96v"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.000510 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.001603 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.001756 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.001766 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.001880 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.002091 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.002278 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.002405 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.002464 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.004088 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9xv4p"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.004694 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.005720 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.006470 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.007438 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2l7nk"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.007878 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.008427 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.008786 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.012383 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pdm8d"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.012943 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fnmb9"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.013161 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.014110 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.014519 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.014702 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.015265 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.015791 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.016351 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.017194 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.017706 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.034745 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.036052 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.036240 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.036446 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.036682 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.036955 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.037443 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.037577 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.037860 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.038236 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.036462 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: 
I0316 00:10:32.037476 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.037513 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.038783 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.039077 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.049256 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.049430 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.049562 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.049681 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.049817 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.049885 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.049940 4816 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.049944 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.049892 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.050151 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.050179 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.050244 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.050310 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.050595 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.050637 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.050699 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.050721 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tv2n7"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051288 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051358 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r9tr\" (UniqueName: \"kubernetes.io/projected/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-kube-api-access-9r9tr\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051295 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051544 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-encryption-config\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051595 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051709 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxgff\" (UniqueName: \"kubernetes.io/projected/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-kube-api-access-vxgff\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051731 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-etcd-service-ca\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051742 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051747 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-audit-dir\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051765 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-etcd-serving-ca\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051821 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051889 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-serving-cert\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051912 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-node-pullsecrets\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051929 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-serving-cert\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051943 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-audit\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051960 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-image-import-ca\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051968 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051973 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pdm8d\" (UID: 
\"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051987 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-config\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052008 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-etcd-client\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052022 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-etcd-ca\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051343 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052090 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052037 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-etcd-client\") pod \"etcd-operator-b45778765-2l7nk\" (UID: 
\"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051448 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052153 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-config\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051504 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052248 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052260 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052322 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052354 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052376 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052457 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 16 
00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052617 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.053434 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-q9xc9"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.053961 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sshl5"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.054382 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.054860 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29560320-s9q72"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.054932 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.055471 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29560320-s9q72" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.055837 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.055993 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.056309 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.057294 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.057639 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-5rr7c"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.058149 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-5rr7c" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.058871 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.058976 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.059116 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.059174 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.059237 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.059934 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.060202 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.060599 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.061935 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ckvwn"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.062045 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.062437 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-l648b"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.062964 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.063238 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.063541 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.064189 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.064274 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.064903 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.067149 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.068052 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.068949 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.069746 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.070377 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.071139 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.072627 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.073216 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.073594 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-gvk75"]
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.074074 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gvk75"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.079112 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db"]
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.080596 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d"]
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.081032 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8"]
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.081178 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.081484 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.081516 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt"]
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.081693 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.082519 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.091894 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7"]
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.092264 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.095523 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.095824 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.096097 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.097314 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.097723 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.097924 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t"]
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.098092 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.098388 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.099608 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.110195 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.111061 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.111362 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.111365 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.111738 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.118856 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.130502 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.131097 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.131286 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.137256 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8226q"]
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.137821 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.138222 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.138345 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.138492 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.138618 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.138726 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.138842 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.139886 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.140081 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.140328 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.142161 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.142335 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.143839 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.144074 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.144157 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.144333 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7"]
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.144943 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2"]
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.145123 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8226q"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.145141 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.145222 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.146106 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.145285 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.148826 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.149579 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560330-44pts"]
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.150417 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560330-44pts"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.152745 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mplx7"]
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.153619 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-oauth-serving-cert\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.153732 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-config\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.153808 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-console-config\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.153884 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxgff\" (UniqueName: \"kubernetes.io/projected/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-kube-api-access-vxgff\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.153950 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6nkm6"]
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154021 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-service-ca\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154092 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1306b657-0022-435d-bb72-793f1c1a106b-config\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154159 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-etcd-service-ca\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154229 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnq8l\" (UniqueName: \"kubernetes.io/projected/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-kube-api-access-xnq8l\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154302 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e41d768-3ed4-4760-a0d5-4308d7b13379-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154371 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-audit-dir\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154459 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-etcd-serving-ca\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154566 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-client-ca\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154677 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/044562bd-df74-47fa-bc8d-1c652233e9c5-serving-cert\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154752 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154800 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzj4l\" (UniqueName: \"kubernetes.io/projected/044562bd-df74-47fa-bc8d-1c652233e9c5-kube-api-access-pzj4l\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154891 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1306b657-0022-435d-bb72-793f1c1a106b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154917 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e41d768-3ed4-4760-a0d5-4308d7b13379-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155058 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdmv5\" (UniqueName: \"kubernetes.io/projected/171b00f7-f7cf-41b3-bffd-11ceeb9f2182-kube-api-access-xdmv5\") pod \"dns-operator-744455d44c-fnmb9\" (UID: \"171b00f7-f7cf-41b3-bffd-11ceeb9f2182\") " pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155100 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dced2102-9fd0-4300-9e0a-35d915f1caad-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155119 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dced2102-9fd0-4300-9e0a-35d915f1caad-audit-dir\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155137 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcsz7\" (UniqueName: \"kubernetes.io/projected/5e41d768-3ed4-4760-a0d5-4308d7b13379-kube-api-access-tcsz7\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155166 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-serving-cert\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155196 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/005900fa-b395-4c1c-8e62-8e975bd0393c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rqxss\" (UID: \"005900fa-b395-4c1c-8e62-8e975bd0393c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155210 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-console-oauth-config\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155226 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-machine-approver-tls\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155600 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155965 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.156204 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-etcd-service-ca\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154484 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.156296 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-audit-dir\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154513 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.156438 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-node-pullsecrets\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159007 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b98dcc1e-4c4b-47eb-9ddf-59a138f94247-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-v8zx4\" (UID: \"b98dcc1e-4c4b-47eb-9ddf-59a138f94247\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159111 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-serving-cert\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159152 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-audit\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159170 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/044562bd-df74-47fa-bc8d-1c652233e9c5-config\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159187 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5e41d768-3ed4-4760-a0d5-4308d7b13379-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159209 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/005900fa-b395-4c1c-8e62-8e975bd0393c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rqxss\" (UID: \"005900fa-b395-4c1c-8e62-8e975bd0393c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159225 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dced2102-9fd0-4300-9e0a-35d915f1caad-serving-cert\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159242 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-image-import-ca\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159258 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159276 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkbl4\" (UniqueName: \"kubernetes.io/projected/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-kube-api-access-bkbl4\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159347 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-config\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159368 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fht74\" (UniqueName: \"kubernetes.io/projected/005900fa-b395-4c1c-8e62-8e975bd0393c-kube-api-access-fht74\") pod \"openshift-apiserver-operator-796bbdcf4f-rqxss\" (UID: \"005900fa-b395-4c1c-8e62-8e975bd0393c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159395 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-etcd-client\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159412 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-etcd-ca\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159427 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-etcd-client\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159442 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dced2102-9fd0-4300-9e0a-35d915f1caad-audit-policies\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159461 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvtdx\" (UniqueName: \"kubernetes.io/projected/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-kube-api-access-fvtdx\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159477 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/044562bd-df74-47fa-bc8d-1c652233e9c5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159494 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-auth-proxy-config\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159507 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-config\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159521 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-serving-cert\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159538 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dced2102-9fd0-4300-9e0a-35d915f1caad-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159572 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zhnp\" (UniqueName: \"kubernetes.io/projected/dced2102-9fd0-4300-9e0a-35d915f1caad-kube-api-access-2zhnp\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159588 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dced2102-9fd0-4300-9e0a-35d915f1caad-etcd-client\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159607 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-config\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159623 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vhmx\" (UniqueName: \"kubernetes.io/projected/b98dcc1e-4c4b-47eb-9ddf-59a138f94247-kube-api-access-9vhmx\") pod \"cluster-samples-operator-665b6dd947-v8zx4\" (UID: \"b98dcc1e-4c4b-47eb-9ddf-59a138f94247\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159641 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1306b657-0022-435d-bb72-793f1c1a106b-images\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159656 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/171b00f7-f7cf-41b3-bffd-11ceeb9f2182-metrics-tls\") pod \"dns-operator-744455d44c-fnmb9\" (UID: \"171b00f7-f7cf-41b3-bffd-11ceeb9f2182\") " pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159678 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r9tr\" (UniqueName: \"kubernetes.io/projected/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-kube-api-access-9r9tr\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159693 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-console-serving-cert\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159708 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dced2102-9fd0-4300-9e0a-35d915f1caad-encryption-config\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159723 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/044562bd-df74-47fa-bc8d-1c652233e9c5-service-ca-bundle\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159743 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-encryption-config\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159757 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jc6t\" (UniqueName: \"kubernetes.io/projected/1306b657-0022-435d-bb72-793f1c1a106b-kube-api-access-2jc6t\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159774 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-trusted-ca-bundle\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.156476 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-node-pullsecrets\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.156597 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-etcd-serving-ca\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.161831 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-audit\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.162082 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-config\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.162294 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-etcd-ca\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.163122 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.164224 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-config\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.164956 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-image-import-ca\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.166046 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.168005 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-serving-cert\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.168352 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2k6jt"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 
00:10:32.168968 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2l7nk"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.168996 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6r96v"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.169078 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2k6jt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.170079 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-etcd-client\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.173115 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-etcd-client\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.176097 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.179692 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.191991 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-encryption-config\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " 
pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.192416 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-serving-cert\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.202363 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.204962 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.211005 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.211179 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.211359 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nnqsw"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.217015 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.217085 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.225190 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.229077 4816 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-q9xc9"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.236676 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.237206 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.237787 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.241003 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.244297 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.246772 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.248938 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9xv4p"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.250609 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fnmb9"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.252387 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.253494 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/downloads-7954f5f757-5rr7c"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.254409 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.256052 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.256785 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.257702 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.258062 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.260954 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pdm8d"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.261585 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sshl5"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.262845 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tv2n7"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263262 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zhnp\" (UniqueName: \"kubernetes.io/projected/dced2102-9fd0-4300-9e0a-35d915f1caad-kube-api-access-2zhnp\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263291 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-images\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263310 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dced2102-9fd0-4300-9e0a-35d915f1caad-etcd-client\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263326 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7442ef1b-27ea-4166-8457-5332c4c8f363-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mplx7\" (UID: \"7442ef1b-27ea-4166-8457-5332c4c8f363\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263342 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9fc59286-0388-4519-afc7-f2c8cf80ab40-serviceca\") pod \"image-pruner-29560320-s9q72\" (UID: \"9fc59286-0388-4519-afc7-f2c8cf80ab40\") " pod="openshift-image-registry/image-pruner-29560320-s9q72" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263357 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1306b657-0022-435d-bb72-793f1c1a106b-images\") pod 
\"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263371 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/171b00f7-f7cf-41b3-bffd-11ceeb9f2182-metrics-tls\") pod \"dns-operator-744455d44c-fnmb9\" (UID: \"171b00f7-f7cf-41b3-bffd-11ceeb9f2182\") " pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263386 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spn2v\" (UniqueName: \"kubernetes.io/projected/dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a-kube-api-access-spn2v\") pod \"control-plane-machine-set-operator-78cbb6b69f-mwhpz\" (UID: \"dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263404 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-metrics-certs\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263420 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwn7z\" (UniqueName: \"kubernetes.io/projected/79ec9746-96c0-4fcd-b367-a42b6950145a-kube-api-access-wwn7z\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263436 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/044562bd-df74-47fa-bc8d-1c652233e9c5-service-ca-bundle\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263454 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-console-serving-cert\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263467 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-trusted-ca-bundle\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263481 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f90d894-17c6-4800-a438-737fe8619e01-serving-cert\") pod \"openshift-config-operator-7777fb866f-d4vrm\" (UID: \"4f90d894-17c6-4800-a438-737fe8619e01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263498 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg5rd\" (UniqueName: \"kubernetes.io/projected/0a98bca9-38d3-4382-a6d6-8410170f7d81-kube-api-access-pg5rd\") pod \"openshift-controller-manager-operator-756b6f6bc6-sh4ps\" (UID: 
\"0a98bca9-38d3-4382-a6d6-8410170f7d81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263512 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-default-certificate\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263528 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgzct\" (UniqueName: \"kubernetes.io/projected/9e737c04-a2db-452e-adc7-fa383e158b53-kube-api-access-wgzct\") pod \"package-server-manager-789f6589d5-jm8db\" (UID: \"9e737c04-a2db-452e-adc7-fa383e158b53\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263562 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3eb76779-61d8-4977-8839-083fcf6cd69b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8tt9t\" (UID: \"3eb76779-61d8-4977-8839-083fcf6cd69b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263585 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-console-config\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263602 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79ec9746-96c0-4fcd-b367-a42b6950145a-config\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263621 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef3a7303-57a8-461f-86c1-fd3f7882e93b-trusted-ca\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263640 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j62nk\" (UniqueName: \"kubernetes.io/projected/0386f821-c5fb-4dfd-acaf-706e214a57c0-kube-api-access-j62nk\") pod \"migrator-59844c95c7-fwkzt\" (UID: \"0386f821-c5fb-4dfd-acaf-706e214a57c0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263660 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mwhpz\" (UID: \"dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263681 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef3a7303-57a8-461f-86c1-fd3f7882e93b-metrics-tls\") pod 
\"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263697 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rclzn\" (UniqueName: \"kubernetes.io/projected/a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea-kube-api-access-rclzn\") pod \"olm-operator-6b444d44fb-nsxl4\" (UID: \"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263717 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-service-ca\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263733 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgh99\" (UniqueName: \"kubernetes.io/projected/02854230-6165-4f22-8780-d8591b991132-kube-api-access-zgh99\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263750 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/34f93b2b-cc36-4965-992c-825bf2595e1e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tgbjg\" (UID: \"34f93b2b-cc36-4965-992c-825bf2595e1e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263764 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkph5\" (UniqueName: \"kubernetes.io/projected/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-kube-api-access-pkph5\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263780 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/064b42ee-720b-456c-8ffe-a247f827befc-apiservice-cert\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263795 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-trusted-ca\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263809 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4f90d894-17c6-4800-a438-737fe8619e01-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d4vrm\" (UID: \"4f90d894-17c6-4800-a438-737fe8619e01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263834 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-client-ca\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: 
\"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263850 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/044562bd-df74-47fa-bc8d-1c652233e9c5-serving-cert\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263864 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-config\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263878 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-cabundle\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263897 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1306b657-0022-435d-bb72-793f1c1a106b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263915 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e41d768-3ed4-4760-a0d5-4308d7b13379-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263936 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq52v\" (UniqueName: \"kubernetes.io/projected/55e76e8f-7d69-4f55-81f8-45c9c612876b-kube-api-access-dq52v\") pod \"auto-csr-approver-29560330-44pts\" (UID: \"55e76e8f-7d69-4f55-81f8-45c9c612876b\") " pod="openshift-infra/auto-csr-approver-29560330-44pts" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263956 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dced2102-9fd0-4300-9e0a-35d915f1caad-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263974 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dced2102-9fd0-4300-9e0a-35d915f1caad-audit-dir\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263989 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdmv5\" (UniqueName: \"kubernetes.io/projected/171b00f7-f7cf-41b3-bffd-11ceeb9f2182-kube-api-access-xdmv5\") pod \"dns-operator-744455d44c-fnmb9\" (UID: \"171b00f7-f7cf-41b3-bffd-11ceeb9f2182\") " pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" Mar 16 00:10:32 crc 
kubenswrapper[4816]: I0316 00:10:32.264005 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/29bcfe72-6ef1-4087-9feb-787fdba3d2d7-profile-collector-cert\") pod \"catalog-operator-68c6474976-gtwmf\" (UID: \"29bcfe72-6ef1-4087-9feb-787fdba3d2d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264020 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct69q\" (UniqueName: \"kubernetes.io/projected/7442ef1b-27ea-4166-8457-5332c4c8f363-kube-api-access-ct69q\") pod \"multus-admission-controller-857f4d67dd-mplx7\" (UID: \"7442ef1b-27ea-4166-8457-5332c4c8f363\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264035 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/005900fa-b395-4c1c-8e62-8e975bd0393c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rqxss\" (UID: \"005900fa-b395-4c1c-8e62-8e975bd0393c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264050 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcsz7\" (UniqueName: \"kubernetes.io/projected/5e41d768-3ed4-4760-a0d5-4308d7b13379-kube-api-access-tcsz7\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264069 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-console-oauth-config\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264083 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-stats-auth\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264108 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b98dcc1e-4c4b-47eb-9ddf-59a138f94247-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-v8zx4\" (UID: \"b98dcc1e-4c4b-47eb-9ddf-59a138f94247\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264125 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/069c0b04-3302-488c-84fc-eeccac5fae9b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264143 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79ec9746-96c0-4fcd-b367-a42b6950145a-serving-cert\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 
00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264159 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea-srv-cert\") pod \"olm-operator-6b444d44fb-nsxl4\" (UID: \"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264176 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkbl4\" (UniqueName: \"kubernetes.io/projected/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-kube-api-access-bkbl4\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264191 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/064b42ee-720b-456c-8ffe-a247f827befc-tmpfs\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264207 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93dadffe-0353-4301-bb97-31b034d3dc64-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t8c6x\" (UID: \"93dadffe-0353-4301-bb97-31b034d3dc64\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264232 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264247 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-client-ca\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264260 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb76779-61d8-4977-8839-083fcf6cd69b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8tt9t\" (UID: \"3eb76779-61d8-4977-8839-083fcf6cd69b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.266133 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1306b657-0022-435d-bb72-793f1c1a106b-images\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.266608 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dced2102-9fd0-4300-9e0a-35d915f1caad-audit-dir\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.267536 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/044562bd-df74-47fa-bc8d-1c652233e9c5-service-ca-bundle\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268121 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dced2102-9fd0-4300-9e0a-35d915f1caad-etcd-client\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268178 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5mpc\" (UniqueName: \"kubernetes.io/projected/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-kube-api-access-n5mpc\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268196 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wvzp\" (UniqueName: \"kubernetes.io/projected/34f93b2b-cc36-4965-992c-825bf2595e1e-kube-api-access-4wvzp\") pod \"machine-config-controller-84d6567774-tgbjg\" (UID: \"34f93b2b-cc36-4965-992c-825bf2595e1e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268213 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-proxy-tls\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: 
\"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268230 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9397185e-a9e3-4ef4-b0be-d9dc9208adff-secret-volume\") pod \"collect-profiles-29560320-4hk5d\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268255 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-serving-cert\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268271 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/044562bd-df74-47fa-bc8d-1c652233e9c5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268292 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vkgk\" (UniqueName: \"kubernetes.io/projected/0ec3cdc0-f024-43cf-b520-7d2437e0f8df-kube-api-access-9vkgk\") pod \"downloads-7954f5f757-5rr7c\" (UID: \"0ec3cdc0-f024-43cf-b520-7d2437e0f8df\") " pod="openshift-console/downloads-7954f5f757-5rr7c" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268307 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/069c0b04-3302-488c-84fc-eeccac5fae9b-config\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268327 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/069c0b04-3302-488c-84fc-eeccac5fae9b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268355 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxxlh\" (UniqueName: \"kubernetes.io/projected/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-kube-api-access-hxxlh\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268372 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ll5r8\" (UID: \"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268387 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-86fb4\" (UniqueName: \"kubernetes.io/projected/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-kube-api-access-86fb4\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268407 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vhmx\" (UniqueName: \"kubernetes.io/projected/b98dcc1e-4c4b-47eb-9ddf-59a138f94247-kube-api-access-9vhmx\") pod \"cluster-samples-operator-665b6dd947-v8zx4\" (UID: \"b98dcc1e-4c4b-47eb-9ddf-59a138f94247\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268563 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268605 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsbsw\" (UniqueName: \"kubernetes.io/projected/59c840a8-f288-44ed-83d3-34d47041c6c6-kube-api-access-zsbsw\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268626 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dced2102-9fd0-4300-9e0a-35d915f1caad-encryption-config\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268645 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jc6t\" (UniqueName: \"kubernetes.io/projected/1306b657-0022-435d-bb72-793f1c1a106b-kube-api-access-2jc6t\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268661 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-key\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268678 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxqlb\" (UniqueName: \"kubernetes.io/projected/29bcfe72-6ef1-4087-9feb-787fdba3d2d7-kube-api-access-xxqlb\") pod \"catalog-operator-68c6474976-gtwmf\" (UID: \"29bcfe72-6ef1-4087-9feb-787fdba3d2d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268693 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nsxl4\" (UID: \"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268707 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/9397185e-a9e3-4ef4-b0be-d9dc9208adff-config-volume\") pod \"collect-profiles-29560320-4hk5d\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268727 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-oauth-serving-cert\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268744 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-config\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268763 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef3a7303-57a8-461f-86c1-fd3f7882e93b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268782 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a98bca9-38d3-4382-a6d6-8410170f7d81-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sh4ps\" (UID: \"0a98bca9-38d3-4382-a6d6-8410170f7d81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:32 
crc kubenswrapper[4816]: I0316 00:10:32.268798 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fskkz\" (UniqueName: \"kubernetes.io/projected/ef3a7303-57a8-461f-86c1-fd3f7882e93b-kube-api-access-fskkz\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268817 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1306b657-0022-435d-bb72-793f1c1a106b-config\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268845 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnq8l\" (UniqueName: \"kubernetes.io/projected/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-kube-api-access-xnq8l\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268861 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e41d768-3ed4-4760-a0d5-4308d7b13379-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.269428 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/005900fa-b395-4c1c-8e62-8e975bd0393c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rqxss\" (UID: 
\"005900fa-b395-4c1c-8e62-8e975bd0393c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.269658 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-console-config\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.269712 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.270104 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ckvwn"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.270113 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-trusted-ca-bundle\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.270210 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dced2102-9fd0-4300-9e0a-35d915f1caad-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.270378 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/044562bd-df74-47fa-bc8d-1c652233e9c5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6r96v\" (UID: 
\"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.270864 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-client-ca\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.270953 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-service-ca\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.271016 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dced2102-9fd0-4300-9e0a-35d915f1caad-encryption-config\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.271772 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-oauth-serving-cert\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.272584 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-console-serving-cert\") pod \"console-f9d7485db-nnqsw\" (UID: 
\"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.272688 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfjfn\" (UniqueName: \"kubernetes.io/projected/4f90d894-17c6-4800-a438-737fe8619e01-kube-api-access-nfjfn\") pod \"openshift-config-operator-7777fb866f-d4vrm\" (UID: \"4f90d894-17c6-4800-a438-737fe8619e01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.272753 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-config\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.272775 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzj4l\" (UniqueName: \"kubernetes.io/projected/044562bd-df74-47fa-bc8d-1c652233e9c5-kube-api-access-pzj4l\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.272808 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.272838 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a98bca9-38d3-4382-a6d6-8410170f7d81-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sh4ps\" (UID: 
\"0a98bca9-38d3-4382-a6d6-8410170f7d81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273166 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e737c04-a2db-452e-adc7-fa383e158b53-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jm8db\" (UID: \"9e737c04-a2db-452e-adc7-fa383e158b53\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273228 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34f93b2b-cc36-4965-992c-825bf2595e1e-proxy-tls\") pod \"machine-config-controller-84d6567774-tgbjg\" (UID: \"34f93b2b-cc36-4965-992c-825bf2595e1e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273276 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-machine-approver-tls\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273354 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-serving-cert\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 
00:10:32.273386 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02854230-6165-4f22-8780-d8591b991132-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273407 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/064b42ee-720b-456c-8ffe-a247f827befc-webhook-cert\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273428 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdffs\" (UniqueName: \"kubernetes.io/projected/9397185e-a9e3-4ef4-b0be-d9dc9208adff-kube-api-access-pdffs\") pod \"collect-profiles-29560320-4hk5d\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273445 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-config\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273466 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/02854230-6165-4f22-8780-d8591b991132-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273485 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5e41d768-3ed4-4760-a0d5-4308d7b13379-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273527 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93dadffe-0353-4301-bb97-31b034d3dc64-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t8c6x\" (UID: \"93dadffe-0353-4301-bb97-31b034d3dc64\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273592 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/044562bd-df74-47fa-bc8d-1c652233e9c5-config\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273626 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/005900fa-b395-4c1c-8e62-8e975bd0393c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rqxss\" (UID: \"005900fa-b395-4c1c-8e62-8e975bd0393c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" Mar 16 
00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273644 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dced2102-9fd0-4300-9e0a-35d915f1caad-serving-cert\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273664 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fht74\" (UniqueName: \"kubernetes.io/projected/005900fa-b395-4c1c-8e62-8e975bd0393c-kube-api-access-fht74\") pod \"openshift-apiserver-operator-796bbdcf4f-rqxss\" (UID: \"005900fa-b395-4c1c-8e62-8e975bd0393c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273709 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/29bcfe72-6ef1-4087-9feb-787fdba3d2d7-srv-cert\") pod \"catalog-operator-68c6474976-gtwmf\" (UID: \"29bcfe72-6ef1-4087-9feb-787fdba3d2d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273736 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59c840a8-f288-44ed-83d3-34d47041c6c6-serving-cert\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273765 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dced2102-9fd0-4300-9e0a-35d915f1caad-audit-policies\") pod \"apiserver-7bbb656c7d-zrq8d\" 
(UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273782 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93dadffe-0353-4301-bb97-31b034d3dc64-config\") pod \"kube-apiserver-operator-766d6c64bb-t8c6x\" (UID: \"93dadffe-0353-4301-bb97-31b034d3dc64\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273810 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-auth-proxy-config\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273826 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-config\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273845 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvtdx\" (UniqueName: \"kubernetes.io/projected/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-kube-api-access-fvtdx\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273876 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-dwr4x\" (UniqueName: \"kubernetes.io/projected/064b42ee-720b-456c-8ffe-a247f827befc-kube-api-access-dwr4x\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273899 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e41d768-3ed4-4760-a0d5-4308d7b13379-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273982 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tj47\" (UniqueName: \"kubernetes.io/projected/9fc59286-0388-4519-afc7-f2c8cf80ab40-kube-api-access-2tj47\") pod \"image-pruner-29560320-s9q72\" (UID: \"9fc59286-0388-4519-afc7-f2c8cf80ab40\") " pod="openshift-image-registry/image-pruner-29560320-s9q72" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.274000 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-service-ca-bundle\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.274019 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs9wq\" (UniqueName: \"kubernetes.io/projected/aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9-kube-api-access-bs9wq\") pod \"kube-storage-version-migrator-operator-b67b599dd-ll5r8\" (UID: \"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273626 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1306b657-0022-435d-bb72-793f1c1a106b-config\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.274125 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/044562bd-df74-47fa-bc8d-1c652233e9c5-config\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.274165 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dced2102-9fd0-4300-9e0a-35d915f1caad-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.274193 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eb76779-61d8-4977-8839-083fcf6cd69b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8tt9t\" (UID: \"3eb76779-61d8-4977-8839-083fcf6cd69b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.274221 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ll5r8\" (UID: \"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.274506 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/005900fa-b395-4c1c-8e62-8e975bd0393c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rqxss\" (UID: \"005900fa-b395-4c1c-8e62-8e975bd0393c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.274847 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dced2102-9fd0-4300-9e0a-35d915f1caad-audit-policies\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.274984 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-config\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.275150 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-auth-proxy-config\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.275293 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dced2102-9fd0-4300-9e0a-35d915f1caad-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.275832 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-l648b"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.276055 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e41d768-3ed4-4760-a0d5-4308d7b13379-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.276816 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.277065 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-serving-cert\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.277636 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29560320-s9q72"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.277884 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/044562bd-df74-47fa-bc8d-1c652233e9c5-serving-cert\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.278016 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/171b00f7-f7cf-41b3-bffd-11ceeb9f2182-metrics-tls\") pod \"dns-operator-744455d44c-fnmb9\" (UID: \"171b00f7-f7cf-41b3-bffd-11ceeb9f2182\") " pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.278320 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-machine-approver-tls\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.278385 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b98dcc1e-4c4b-47eb-9ddf-59a138f94247-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-v8zx4\" (UID: \"b98dcc1e-4c4b-47eb-9ddf-59a138f94247\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.278642 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-console-oauth-config\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.280637 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dced2102-9fd0-4300-9e0a-35d915f1caad-serving-cert\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.281806 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.282860 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560330-44pts"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.283960 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.285029 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.286070 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-trp9l"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.289762 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.289796 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mplx7"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.289811 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.289903 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.290269 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1306b657-0022-435d-bb72-793f1c1a106b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.290784 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6nkm6"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.292484 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8226q"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.293223 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.299609 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.302178 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.303476 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-trp9l"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.305057 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2k6jt"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.305589 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt"] Mar 16 00:10:32 crc 
kubenswrapper[4816]: I0316 00:10:32.307668 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.311534 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-npvts"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.313192 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-npvts"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.313325 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-npvts" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.316466 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.334441 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-vkr88"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.335066 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.337320 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.357905 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.375470 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-config\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.375669 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-cabundle\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.375778 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq52v\" (UniqueName: \"kubernetes.io/projected/55e76e8f-7d69-4f55-81f8-45c9c612876b-kube-api-access-dq52v\") pod \"auto-csr-approver-29560330-44pts\" (UID: \"55e76e8f-7d69-4f55-81f8-45c9c612876b\") " pod="openshift-infra/auto-csr-approver-29560330-44pts" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.375870 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/29bcfe72-6ef1-4087-9feb-787fdba3d2d7-profile-collector-cert\") 
pod \"catalog-operator-68c6474976-gtwmf\" (UID: \"29bcfe72-6ef1-4087-9feb-787fdba3d2d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.375947 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct69q\" (UniqueName: \"kubernetes.io/projected/7442ef1b-27ea-4166-8457-5332c4c8f363-kube-api-access-ct69q\") pod \"multus-admission-controller-857f4d67dd-mplx7\" (UID: \"7442ef1b-27ea-4166-8457-5332c4c8f363\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376039 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-stats-auth\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376114 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/069c0b04-3302-488c-84fc-eeccac5fae9b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376191 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79ec9746-96c0-4fcd-b367-a42b6950145a-serving-cert\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376287 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/064b42ee-720b-456c-8ffe-a247f827befc-tmpfs\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376405 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93dadffe-0353-4301-bb97-31b034d3dc64-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t8c6x\" (UID: \"93dadffe-0353-4301-bb97-31b034d3dc64\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376504 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea-srv-cert\") pod \"olm-operator-6b444d44fb-nsxl4\" (UID: \"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376636 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376752 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-client-ca\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376860 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb76779-61d8-4977-8839-083fcf6cd69b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8tt9t\" (UID: \"3eb76779-61d8-4977-8839-083fcf6cd69b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376980 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5mpc\" (UniqueName: \"kubernetes.io/projected/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-kube-api-access-n5mpc\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.377065 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/064b42ee-720b-456c-8ffe-a247f827befc-tmpfs\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376987 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-config\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.377018 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.377445 
4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.377567 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-client-ca\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.377576 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wvzp\" (UniqueName: \"kubernetes.io/projected/34f93b2b-cc36-4965-992c-825bf2595e1e-kube-api-access-4wvzp\") pod \"machine-config-controller-84d6567774-tgbjg\" (UID: \"34f93b2b-cc36-4965-992c-825bf2595e1e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.377882 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vkgk\" (UniqueName: \"kubernetes.io/projected/0ec3cdc0-f024-43cf-b520-7d2437e0f8df-kube-api-access-9vkgk\") pod \"downloads-7954f5f757-5rr7c\" (UID: \"0ec3cdc0-f024-43cf-b520-7d2437e0f8df\") " pod="openshift-console/downloads-7954f5f757-5rr7c" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.378012 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/069c0b04-3302-488c-84fc-eeccac5fae9b-config\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.378130 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-proxy-tls\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.378254 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9397185e-a9e3-4ef4-b0be-d9dc9208adff-secret-volume\") pod \"collect-profiles-29560320-4hk5d\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.378363 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/069c0b04-3302-488c-84fc-eeccac5fae9b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.378469 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxxlh\" (UniqueName: \"kubernetes.io/projected/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-kube-api-access-hxxlh\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.378596 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ll5r8\" (UID: \"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.378710 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86fb4\" (UniqueName: \"kubernetes.io/projected/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-kube-api-access-86fb4\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.379016 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.379161 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsbsw\" (UniqueName: \"kubernetes.io/projected/59c840a8-f288-44ed-83d3-34d47041c6c6-kube-api-access-zsbsw\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.379326 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-key\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.379613 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxqlb\" (UniqueName: \"kubernetes.io/projected/29bcfe72-6ef1-4087-9feb-787fdba3d2d7-kube-api-access-xxqlb\") pod \"catalog-operator-68c6474976-gtwmf\" (UID: \"29bcfe72-6ef1-4087-9feb-787fdba3d2d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.379784 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nsxl4\" (UID: \"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.379923 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9397185e-a9e3-4ef4-b0be-d9dc9208adff-config-volume\") pod \"collect-profiles-29560320-4hk5d\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.380073 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef3a7303-57a8-461f-86c1-fd3f7882e93b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.380192 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0a98bca9-38d3-4382-a6d6-8410170f7d81-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sh4ps\" (UID: \"0a98bca9-38d3-4382-a6d6-8410170f7d81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.380297 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fskkz\" (UniqueName: \"kubernetes.io/projected/ef3a7303-57a8-461f-86c1-fd3f7882e93b-kube-api-access-fskkz\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.380424 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfjfn\" (UniqueName: \"kubernetes.io/projected/4f90d894-17c6-4800-a438-737fe8619e01-kube-api-access-nfjfn\") pod \"openshift-config-operator-7777fb866f-d4vrm\" (UID: \"4f90d894-17c6-4800-a438-737fe8619e01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.380593 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a98bca9-38d3-4382-a6d6-8410170f7d81-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sh4ps\" (UID: \"0a98bca9-38d3-4382-a6d6-8410170f7d81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.380721 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e737c04-a2db-452e-adc7-fa383e158b53-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jm8db\" (UID: 
\"9e737c04-a2db-452e-adc7-fa383e158b53\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.380836 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34f93b2b-cc36-4965-992c-825bf2595e1e-proxy-tls\") pod \"machine-config-controller-84d6567774-tgbjg\" (UID: \"34f93b2b-cc36-4965-992c-825bf2595e1e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.380957 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-serving-cert\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381066 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02854230-6165-4f22-8780-d8591b991132-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381175 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/064b42ee-720b-456c-8ffe-a247f827befc-webhook-cert\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.380865 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0a98bca9-38d3-4382-a6d6-8410170f7d81-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sh4ps\" (UID: \"0a98bca9-38d3-4382-a6d6-8410170f7d81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381287 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdffs\" (UniqueName: \"kubernetes.io/projected/9397185e-a9e3-4ef4-b0be-d9dc9208adff-kube-api-access-pdffs\") pod \"collect-profiles-29560320-4hk5d\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381357 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-config\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381386 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/02854230-6165-4f22-8780-d8591b991132-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381432 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93dadffe-0353-4301-bb97-31b034d3dc64-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t8c6x\" (UID: \"93dadffe-0353-4301-bb97-31b034d3dc64\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" Mar 16 
00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381482 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/29bcfe72-6ef1-4087-9feb-787fdba3d2d7-srv-cert\") pod \"catalog-operator-68c6474976-gtwmf\" (UID: \"29bcfe72-6ef1-4087-9feb-787fdba3d2d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381500 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59c840a8-f288-44ed-83d3-34d47041c6c6-serving-cert\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381580 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwr4x\" (UniqueName: \"kubernetes.io/projected/064b42ee-720b-456c-8ffe-a247f827befc-kube-api-access-dwr4x\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381605 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tj47\" (UniqueName: \"kubernetes.io/projected/9fc59286-0388-4519-afc7-f2c8cf80ab40-kube-api-access-2tj47\") pod \"image-pruner-29560320-s9q72\" (UID: \"9fc59286-0388-4519-afc7-f2c8cf80ab40\") " pod="openshift-image-registry/image-pruner-29560320-s9q72" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381623 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-service-ca-bundle\") pod \"router-default-5444994796-gvk75\" (UID: 
\"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381641 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93dadffe-0353-4301-bb97-31b034d3dc64-config\") pod \"kube-apiserver-operator-766d6c64bb-t8c6x\" (UID: \"93dadffe-0353-4301-bb97-31b034d3dc64\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381671 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eb76779-61d8-4977-8839-083fcf6cd69b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8tt9t\" (UID: \"3eb76779-61d8-4977-8839-083fcf6cd69b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381696 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ll5r8\" (UID: \"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381724 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs9wq\" (UniqueName: \"kubernetes.io/projected/aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9-kube-api-access-bs9wq\") pod \"kube-storage-version-migrator-operator-b67b599dd-ll5r8\" (UID: \"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 
00:10:32.381757 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-images\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381793 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7442ef1b-27ea-4166-8457-5332c4c8f363-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mplx7\" (UID: \"7442ef1b-27ea-4166-8457-5332c4c8f363\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381811 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9fc59286-0388-4519-afc7-f2c8cf80ab40-serviceca\") pod \"image-pruner-29560320-s9q72\" (UID: \"9fc59286-0388-4519-afc7-f2c8cf80ab40\") " pod="openshift-image-registry/image-pruner-29560320-s9q72" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381837 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spn2v\" (UniqueName: \"kubernetes.io/projected/dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a-kube-api-access-spn2v\") pod \"control-plane-machine-set-operator-78cbb6b69f-mwhpz\" (UID: \"dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381856 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-metrics-certs\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " 
pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381875 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwn7z\" (UniqueName: \"kubernetes.io/projected/79ec9746-96c0-4fcd-b367-a42b6950145a-kube-api-access-wwn7z\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381913 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f90d894-17c6-4800-a438-737fe8619e01-serving-cert\") pod \"openshift-config-operator-7777fb866f-d4vrm\" (UID: \"4f90d894-17c6-4800-a438-737fe8619e01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381939 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg5rd\" (UniqueName: \"kubernetes.io/projected/0a98bca9-38d3-4382-a6d6-8410170f7d81-kube-api-access-pg5rd\") pod \"openshift-controller-manager-operator-756b6f6bc6-sh4ps\" (UID: \"0a98bca9-38d3-4382-a6d6-8410170f7d81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381959 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-default-certificate\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381984 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgzct\" (UniqueName: 
\"kubernetes.io/projected/9e737c04-a2db-452e-adc7-fa383e158b53-kube-api-access-wgzct\") pod \"package-server-manager-789f6589d5-jm8db\" (UID: \"9e737c04-a2db-452e-adc7-fa383e158b53\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.382004 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3eb76779-61d8-4977-8839-083fcf6cd69b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8tt9t\" (UID: \"3eb76779-61d8-4977-8839-083fcf6cd69b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.382026 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79ec9746-96c0-4fcd-b367-a42b6950145a-config\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.382046 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef3a7303-57a8-461f-86c1-fd3f7882e93b-trusted-ca\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.382063 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j62nk\" (UniqueName: \"kubernetes.io/projected/0386f821-c5fb-4dfd-acaf-706e214a57c0-kube-api-access-j62nk\") pod \"migrator-59844c95c7-fwkzt\" (UID: \"0386f821-c5fb-4dfd-acaf-706e214a57c0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt" Mar 16 00:10:32 crc 
kubenswrapper[4816]: I0316 00:10:32.382207 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.382346 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-config\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.382903 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mwhpz\" (UID: \"dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.382936 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef3a7303-57a8-461f-86c1-fd3f7882e93b-metrics-tls\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.382958 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rclzn\" (UniqueName: \"kubernetes.io/projected/a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea-kube-api-access-rclzn\") pod \"olm-operator-6b444d44fb-nsxl4\" 
(UID: \"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.383040 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgh99\" (UniqueName: \"kubernetes.io/projected/02854230-6165-4f22-8780-d8591b991132-kube-api-access-zgh99\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.383119 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/34f93b2b-cc36-4965-992c-825bf2595e1e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tgbjg\" (UID: \"34f93b2b-cc36-4965-992c-825bf2595e1e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.383189 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkph5\" (UniqueName: \"kubernetes.io/projected/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-kube-api-access-pkph5\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.383209 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/064b42ee-720b-456c-8ffe-a247f827befc-apiservice-cert\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.383251 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-trusted-ca\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.383297 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4f90d894-17c6-4800-a438-737fe8619e01-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d4vrm\" (UID: \"4f90d894-17c6-4800-a438-737fe8619e01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.383800 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a98bca9-38d3-4382-a6d6-8410170f7d81-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sh4ps\" (UID: \"0a98bca9-38d3-4382-a6d6-8410170f7d81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.383835 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4f90d894-17c6-4800-a438-737fe8619e01-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d4vrm\" (UID: \"4f90d894-17c6-4800-a438-737fe8619e01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.384846 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-serving-cert\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 
16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.385513 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9fc59286-0388-4519-afc7-f2c8cf80ab40-serviceca\") pod \"image-pruner-29560320-s9q72\" (UID: \"9fc59286-0388-4519-afc7-f2c8cf80ab40\") " pod="openshift-image-registry/image-pruner-29560320-s9q72" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.385583 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/34f93b2b-cc36-4965-992c-825bf2595e1e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tgbjg\" (UID: \"34f93b2b-cc36-4965-992c-825bf2595e1e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.385729 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f90d894-17c6-4800-a438-737fe8619e01-serving-cert\") pod \"openshift-config-operator-7777fb866f-d4vrm\" (UID: \"4f90d894-17c6-4800-a438-737fe8619e01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.389558 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-trusted-ca\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.390610 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59c840a8-f288-44ed-83d3-34d47041c6c6-serving-cert\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.397406 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.421849 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.424596 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef3a7303-57a8-461f-86c1-fd3f7882e93b-trusted-ca\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.437490 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.457111 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.477128 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.488664 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef3a7303-57a8-461f-86c1-fd3f7882e93b-metrics-tls\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.497285 4816 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.517383 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.537375 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.560010 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.578504 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.588237 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mwhpz\" (UID: \"dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.597343 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.618007 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.637826 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.641795 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9397185e-a9e3-4ef4-b0be-d9dc9208adff-secret-volume\") pod \"collect-profiles-29560320-4hk5d\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.643257 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nsxl4\" (UID: \"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.648504 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/29bcfe72-6ef1-4087-9feb-787fdba3d2d7-profile-collector-cert\") pod \"catalog-operator-68c6474976-gtwmf\" (UID: \"29bcfe72-6ef1-4087-9feb-787fdba3d2d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.656379 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.664003 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/29bcfe72-6ef1-4087-9feb-787fdba3d2d7-srv-cert\") pod \"catalog-operator-68c6474976-gtwmf\" (UID: \"29bcfe72-6ef1-4087-9feb-787fdba3d2d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.677126 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 16 00:10:32 crc 
kubenswrapper[4816]: I0316 00:10:32.698586 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.717947 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.724694 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34f93b2b-cc36-4965-992c-825bf2595e1e-proxy-tls\") pod \"machine-config-controller-84d6567774-tgbjg\" (UID: \"34f93b2b-cc36-4965-992c-825bf2595e1e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.737654 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.757441 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.777190 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.796927 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.804485 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93dadffe-0353-4301-bb97-31b034d3dc64-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t8c6x\" (UID: \"93dadffe-0353-4301-bb97-31b034d3dc64\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.817842 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.825401 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93dadffe-0353-4301-bb97-31b034d3dc64-config\") pod \"kube-apiserver-operator-766d6c64bb-t8c6x\" (UID: \"93dadffe-0353-4301-bb97-31b034d3dc64\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.836748 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.843695 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-images\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.856526 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.877540 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.880938 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-proxy-tls\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.897378 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.910254 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea-srv-cert\") pod \"olm-operator-6b444d44fb-nsxl4\" (UID: \"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.916945 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.936642 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.956431 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.967580 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-default-certificate\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.977019 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.989079 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-stats-auth\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.996412 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.007031 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-metrics-certs\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.017681 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.022705 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-service-ca-bundle\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.037083 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.056255 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.064489 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e737c04-a2db-452e-adc7-fa383e158b53-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jm8db\" (UID: \"9e737c04-a2db-452e-adc7-fa383e158b53\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.076862 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.082213 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ll5r8\" (UID: \"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.095622 4816 request.go:700] Waited for 1.013848791s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.097208 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.116933 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.122925 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ll5r8\" (UID: \"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.136886 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.156393 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.177811 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.181959 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9397185e-a9e3-4ef4-b0be-d9dc9208adff-config-volume\") pod \"collect-profiles-29560320-4hk5d\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.197287 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.217698 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.236922 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.257133 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.297766 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.304739 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/064b42ee-720b-456c-8ffe-a247f827befc-webhook-cert\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.307342 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/064b42ee-720b-456c-8ffe-a247f827befc-apiservice-cert\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.319688 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.337262 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.347903 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb76779-61d8-4977-8839-083fcf6cd69b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8tt9t\" (UID: \"3eb76779-61d8-4977-8839-083fcf6cd69b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.357795 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.376003 4816 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition
Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.376113 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-cabundle podName:db720c64-b1fa-48c9-a4b7-fc42f8ca47fd nodeName:}" failed. No retries permitted until 2026-03-16 00:10:33.876083207 +0000 UTC m=+226.972383200 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-cabundle") pod "service-ca-9c57cc56f-6nkm6" (UID: "db720c64-b1fa-48c9-a4b7-fc42f8ca47fd") : failed to sync configmap cache: timed out waiting for the condition
Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.376268 4816 secret.go:188] Couldn't get secret openshift-kube-controller-manager-operator/kube-controller-manager-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.376290 4816 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.376350 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79ec9746-96c0-4fcd-b367-a42b6950145a-serving-cert podName:79ec9746-96c0-4fcd-b367-a42b6950145a nodeName:}" failed. No retries permitted until 2026-03-16 00:10:33.876324704 +0000 UTC m=+226.972624667 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/79ec9746-96c0-4fcd-b367-a42b6950145a-serving-cert") pod "service-ca-operator-777779d784-vvdz2" (UID: "79ec9746-96c0-4fcd-b367-a42b6950145a") : failed to sync secret cache: timed out waiting for the condition
Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.376369 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/069c0b04-3302-488c-84fc-eeccac5fae9b-serving-cert podName:069c0b04-3302-488c-84fc-eeccac5fae9b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:33.876360325 +0000 UTC m=+226.972660278 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/069c0b04-3302-488c-84fc-eeccac5fae9b-serving-cert") pod "kube-controller-manager-operator-78b949d7b-fk9l7" (UID: "069c0b04-3302-488c-84fc-eeccac5fae9b") : failed to sync secret cache: timed out waiting for the condition
Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.378728 4816 configmap.go:193] Couldn't get configMap openshift-kube-controller-manager-operator/kube-controller-manager-operator-config: failed to sync configmap cache: timed out waiting for the condition
Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.378796 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/069c0b04-3302-488c-84fc-eeccac5fae9b-config podName:069c0b04-3302-488c-84fc-eeccac5fae9b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:33.878777601 +0000 UTC m=+226.975077554 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/069c0b04-3302-488c-84fc-eeccac5fae9b-config") pod "kube-controller-manager-operator-78b949d7b-fk9l7" (UID: "069c0b04-3302-488c-84fc-eeccac5fae9b") : failed to sync configmap cache: timed out waiting for the condition
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.379779 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.379811 4816 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition
Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.379914 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-key podName:db720c64-b1fa-48c9-a4b7-fc42f8ca47fd nodeName:}" failed. No retries permitted until 2026-03-16 00:10:33.879887721 +0000 UTC m=+226.976187704 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-key") pod "service-ca-9c57cc56f-6nkm6" (UID: "db720c64-b1fa-48c9-a4b7-fc42f8ca47fd") : failed to sync secret cache: timed out waiting for the condition
Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.382196 4816 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.382256 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/02854230-6165-4f22-8780-d8591b991132-marketplace-trusted-ca podName:02854230-6165-4f22-8780-d8591b991132 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:33.882242785 +0000 UTC m=+226.978542748 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/02854230-6165-4f22-8780-d8591b991132-marketplace-trusted-ca") pod "marketplace-operator-79b997595-8226q" (UID: "02854230-6165-4f22-8780-d8591b991132") : failed to sync configmap cache: timed out waiting for the condition
Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.382731 4816 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition
Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.382807 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02854230-6165-4f22-8780-d8591b991132-marketplace-operator-metrics podName:02854230-6165-4f22-8780-d8591b991132 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:33.8827914 +0000 UTC m=+226.979091413 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/02854230-6165-4f22-8780-d8591b991132-marketplace-operator-metrics") pod "marketplace-operator-79b997595-8226q" (UID: "02854230-6165-4f22-8780-d8591b991132") : failed to sync secret cache: timed out waiting for the condition
Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.384096 4816 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition
Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.384113 4816 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition
Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.384171 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7442ef1b-27ea-4166-8457-5332c4c8f363-webhook-certs podName:7442ef1b-27ea-4166-8457-5332c4c8f363 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:33.884154078 +0000 UTC m=+226.980454071 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7442ef1b-27ea-4166-8457-5332c4c8f363-webhook-certs") pod "multus-admission-controller-857f4d67dd-mplx7" (UID: "7442ef1b-27ea-4166-8457-5332c4c8f363") : failed to sync secret cache: timed out waiting for the condition
Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.384211 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/79ec9746-96c0-4fcd-b367-a42b6950145a-config podName:79ec9746-96c0-4fcd-b367-a42b6950145a nodeName:}" failed. No retries permitted until 2026-03-16 00:10:33.884184658 +0000 UTC m=+226.980484682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/79ec9746-96c0-4fcd-b367-a42b6950145a-config") pod "service-ca-operator-777779d784-vvdz2" (UID: "79ec9746-96c0-4fcd-b367-a42b6950145a") : failed to sync configmap cache: timed out waiting for the condition
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.389428 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3eb76779-61d8-4977-8839-083fcf6cd69b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8tt9t\" (UID: \"3eb76779-61d8-4977-8839-083fcf6cd69b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.397257 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.417915 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.438898 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.467126 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.477544 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.497997 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.517624 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.536753 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.556768 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.578899 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.597506 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.617472 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.637204 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.658067 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.677620 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.696872 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.730823 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxgff\" (UniqueName: \"kubernetes.io/projected/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-kube-api-access-vxgff\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.737668 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.757259 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.777292 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.798692 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.819349 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.836964 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.857770 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.893078 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r9tr\" (UniqueName: \"kubernetes.io/projected/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-kube-api-access-9r9tr\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.898284 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.904903 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7442ef1b-27ea-4166-8457-5332c4c8f363-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mplx7\" (UID: \"7442ef1b-27ea-4166-8457-5332c4c8f363\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.905014 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79ec9746-96c0-4fcd-b367-a42b6950145a-config\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.905071 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-cabundle\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.905140 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/069c0b04-3302-488c-84fc-eeccac5fae9b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.905162 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79ec9746-96c0-4fcd-b367-a42b6950145a-serving-cert\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.905223 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/069c0b04-3302-488c-84fc-eeccac5fae9b-config\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.905311 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-key\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.905417 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02854230-6165-4f22-8780-d8591b991132-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.905452 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/02854230-6165-4f22-8780-d8591b991132-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.906107 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79ec9746-96c0-4fcd-b367-a42b6950145a-config\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.906236 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/069c0b04-3302-488c-84fc-eeccac5fae9b-config\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.906790 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02854230-6165-4f22-8780-d8591b991132-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.906867 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-cabundle\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.908286 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79ec9746-96c0-4fcd-b367-a42b6950145a-serving-cert\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.908371 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/069c0b04-3302-488c-84fc-eeccac5fae9b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.908766 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-key\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.908905 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7442ef1b-27ea-4166-8457-5332c4c8f363-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mplx7\" (UID: \"7442ef1b-27ea-4166-8457-5332c4c8f363\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.909242 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/02854230-6165-4f22-8780-d8591b991132-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.916613 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.936946 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.948647 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.957009 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.977564 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.012484 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zhnp\" (UniqueName: \"kubernetes.io/projected/dced2102-9fd0-4300-9e0a-35d915f1caad-kube-api-access-2zhnp\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.038506 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkbl4\" (UniqueName: \"kubernetes.io/projected/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-kube-api-access-bkbl4\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv"
Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.053913 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.056674 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcsz7\" (UniqueName: \"kubernetes.io/projected/5e41d768-3ed4-4760-a0d5-4308d7b13379-kube-api-access-tcsz7\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz"
Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.081101 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdmv5\" (UniqueName: \"kubernetes.io/projected/171b00f7-f7cf-41b3-bffd-11ceeb9f2182-kube-api-access-xdmv5\") pod \"dns-operator-744455d44c-fnmb9\" (UID: \"171b00f7-f7cf-41b3-bffd-11ceeb9f2182\") " pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9"
Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.090942 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2l7nk"]
Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.095330 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vhmx\" (UniqueName: \"kubernetes.io/projected/b98dcc1e-4c4b-47eb-9ddf-59a138f94247-kube-api-access-9vhmx\") pod \"cluster-samples-operator-665b6dd947-v8zx4\" (UID: \"b98dcc1e-4c4b-47eb-9ddf-59a138f94247\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4"
Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.110875 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jc6t\" (UniqueName: \"kubernetes.io/projected/1306b657-0022-435d-bb72-793f1c1a106b-kube-api-access-2jc6t\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p"
Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.112099 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pdm8d"]
Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.115752 4816 request.go:700] Waited for 1.84377791s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/console/token
Mar 16 00:10:34 crc kubenswrapper[4816]: W0316 00:10:34.120468 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f26ea52_1f97_4d4a_98bd_897c5b3b88c5.slice/crio-227d11413d29bca1a6b2536f1ef23d7f212990e7b63c387b2647895501abe9e2 WatchSource:0}: Error finding container 227d11413d29bca1a6b2536f1ef23d7f212990e7b63c387b2647895501abe9e2: Status 404 returned error can't find the container with id 227d11413d29bca1a6b2536f1ef23d7f212990e7b63c387b2647895501abe9e2
Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.134226 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnq8l\" (UniqueName: \"kubernetes.io/projected/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-kube-api-access-xnq8l\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.151545 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName:
\"kubernetes.io/projected/5e41d768-3ed4-4760-a0d5-4308d7b13379-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.171316 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.171915 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvtdx\" (UniqueName: \"kubernetes.io/projected/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-kube-api-access-fvtdx\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.191564 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fht74\" (UniqueName: \"kubernetes.io/projected/005900fa-b395-4c1c-8e62-8e975bd0393c-kube-api-access-fht74\") pod \"openshift-apiserver-operator-796bbdcf4f-rqxss\" (UID: \"005900fa-b395-4c1c-8e62-8e975bd0393c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.212178 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzj4l\" (UniqueName: \"kubernetes.io/projected/044562bd-df74-47fa-bc8d-1c652233e9c5-kube-api-access-pzj4l\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.216808 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 16 
00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.236662 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.236867 4816 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.257562 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.258595 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"] Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.276391 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.297027 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.317079 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.326826 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.336910 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.345205 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" Mar 16 00:10:34 crc kubenswrapper[4816]: W0316 00:10:34.345757 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1674a73_a65c_4a8d_9dc5_af576a7af7d4.slice/crio-156c21f74efb92d4cd61293079b23b80bcd12503e81ebb0027270843d47d9d60 WatchSource:0}: Error finding container 156c21f74efb92d4cd61293079b23b80bcd12503e81ebb0027270843d47d9d60: Status 404 returned error can't find the container with id 156c21f74efb92d4cd61293079b23b80bcd12503e81ebb0027270843d47d9d60 Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.358001 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.359290 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.378062 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.380748 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9xv4p"] Mar 16 00:10:34 crc kubenswrapper[4816]: W0316 00:10:34.393427 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1306b657_0022_435d_bb72_793f1c1a106b.slice/crio-64b816ad085833cd80a47777fbb505eb9e3f5c4e4e78be37220a15c940992742 WatchSource:0}: Error finding container 64b816ad085833cd80a47777fbb505eb9e3f5c4e4e78be37220a15c940992742: Status 404 returned error can't find the container with id 64b816ad085833cd80a47777fbb505eb9e3f5c4e4e78be37220a15c940992742 Mar 16 00:10:34 crc 
kubenswrapper[4816]: I0316 00:10:34.402630 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.412430 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4"] Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.416817 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq52v\" (UniqueName: \"kubernetes.io/projected/55e76e8f-7d69-4f55-81f8-45c9c612876b-kube-api-access-dq52v\") pod \"auto-csr-approver-29560330-44pts\" (UID: \"55e76e8f-7d69-4f55-81f8-45c9c612876b\") " pod="openshift-infra/auto-csr-approver-29560330-44pts" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.418402 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.452493 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93dadffe-0353-4301-bb97-31b034d3dc64-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t8c6x\" (UID: \"93dadffe-0353-4301-bb97-31b034d3dc64\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.457016 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct69q\" (UniqueName: \"kubernetes.io/projected/7442ef1b-27ea-4166-8457-5332c4c8f363-kube-api-access-ct69q\") pod \"multus-admission-controller-857f4d67dd-mplx7\" (UID: \"7442ef1b-27ea-4166-8457-5332c4c8f363\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.460938 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.479070 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5mpc\" (UniqueName: \"kubernetes.io/projected/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-kube-api-access-n5mpc\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.483039 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.498153 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wvzp\" (UniqueName: \"kubernetes.io/projected/34f93b2b-cc36-4965-992c-825bf2595e1e-kube-api-access-4wvzp\") pod \"machine-config-controller-84d6567774-tgbjg\" (UID: \"34f93b2b-cc36-4965-992c-825bf2595e1e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.513856 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.521289 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.522087 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vkgk\" (UniqueName: \"kubernetes.io/projected/0ec3cdc0-f024-43cf-b520-7d2437e0f8df-kube-api-access-9vkgk\") pod \"downloads-7954f5f757-5rr7c\" (UID: \"0ec3cdc0-f024-43cf-b520-7d2437e0f8df\") " pod="openshift-console/downloads-7954f5f757-5rr7c" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.550169 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.556310 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxxlh\" (UniqueName: \"kubernetes.io/projected/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-kube-api-access-hxxlh\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.561036 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/069c0b04-3302-488c-84fc-eeccac5fae9b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.579020 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86fb4\" (UniqueName: \"kubernetes.io/projected/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-kube-api-access-86fb4\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " 
pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.592877 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fnmb9"] Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.604655 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.607910 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsbsw\" (UniqueName: \"kubernetes.io/projected/59c840a8-f288-44ed-83d3-34d47041c6c6-kube-api-access-zsbsw\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.620022 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"] Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.626435 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxqlb\" (UniqueName: \"kubernetes.io/projected/29bcfe72-6ef1-4087-9feb-787fdba3d2d7-kube-api-access-xxqlb\") pod \"catalog-operator-68c6474976-gtwmf\" (UID: \"29bcfe72-6ef1-4087-9feb-787fdba3d2d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.630538 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560330-44pts" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.632942 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef3a7303-57a8-461f-86c1-fd3f7882e93b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.646164 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.659690 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fskkz\" (UniqueName: \"kubernetes.io/projected/ef3a7303-57a8-461f-86c1-fd3f7882e93b-kube-api-access-fskkz\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.681256 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfjfn\" (UniqueName: \"kubernetes.io/projected/4f90d894-17c6-4800-a438-737fe8619e01-kube-api-access-nfjfn\") pod \"openshift-config-operator-7777fb866f-d4vrm\" (UID: \"4f90d894-17c6-4800-a438-737fe8619e01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.694800 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwr4x\" (UniqueName: \"kubernetes.io/projected/064b42ee-720b-456c-8ffe-a247f827befc-kube-api-access-dwr4x\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:34 crc 
kubenswrapper[4816]: I0316 00:10:34.709932 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.721137 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tj47\" (UniqueName: \"kubernetes.io/projected/9fc59286-0388-4519-afc7-f2c8cf80ab40-kube-api-access-2tj47\") pod \"image-pruner-29560320-s9q72\" (UID: \"9fc59286-0388-4519-afc7-f2c8cf80ab40\") " pod="openshift-image-registry/image-pruner-29560320-s9q72" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.722929 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.732631 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29560320-s9q72" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.733355 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eb76779-61d8-4977-8839-083fcf6cd69b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8tt9t\" (UID: \"3eb76779-61d8-4977-8839-083fcf6cd69b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.746852 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-5rr7c" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.749522 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" event={"ID":"dced2102-9fd0-4300-9e0a-35d915f1caad","Type":"ContainerStarted","Data":"0f46d32a82d22035ebca6b72f5d02298b879e23dc630a2d6147dc46c9ae24083"} Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.750724 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" event={"ID":"c1674a73-a65c-4a8d-9dc5-af576a7af7d4","Type":"ContainerStarted","Data":"156c21f74efb92d4cd61293079b23b80bcd12503e81ebb0027270843d47d9d60"} Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.751873 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" event={"ID":"1306b657-0022-435d-bb72-793f1c1a106b","Type":"ContainerStarted","Data":"64b816ad085833cd80a47777fbb505eb9e3f5c4e4e78be37220a15c940992742"} Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.752100 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.753581 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" event={"ID":"c71f28a1-a68d-41c2-a9e6-4984e2e22c74","Type":"ContainerStarted","Data":"013ba8198b1ad95322c8c4ed9ed54ee95419f7fa9530c9926a8a7542e0bd3fdb"} Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.753608 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" event={"ID":"c71f28a1-a68d-41c2-a9e6-4984e2e22c74","Type":"ContainerStarted","Data":"ecadbb640205a413a401e341202c47a74197065f9e52bbe132223e6f5560a08b"} Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.754992 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs9wq\" (UniqueName: \"kubernetes.io/projected/aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9-kube-api-access-bs9wq\") pod \"kube-storage-version-migrator-operator-b67b599dd-ll5r8\" (UID: \"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.755259 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" event={"ID":"171b00f7-f7cf-41b3-bffd-11ceeb9f2182","Type":"ContainerStarted","Data":"b46ca38a8fb490811ed5bb54ba044f6698f8a29b3e0912b3a995b3f104cd2fa5"} Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.756211 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gvk75" event={"ID":"681ca8e4-f909-4e8b-9f35-5ab8ca382e44","Type":"ContainerStarted","Data":"0f3dfd3525c221051b9ec7cabcb25168989137878f8bcc52c0c3c3226135fea5"} Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.757398 4816 generic.go:334] "Generic 
(PLEG): container finished" podID="1f26ea52-1f97-4d4a-98bd-897c5b3b88c5" containerID="973c9812ff199e7d01a75bd951b16fc7b17c1f59dae9b6ee85591b33b9699bf5" exitCode=0 Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.757435 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" event={"ID":"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5","Type":"ContainerDied","Data":"973c9812ff199e7d01a75bd951b16fc7b17c1f59dae9b6ee85591b33b9699bf5"} Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.757461 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" event={"ID":"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5","Type":"ContainerStarted","Data":"227d11413d29bca1a6b2536f1ef23d7f212990e7b63c387b2647895501abe9e2"} Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.767038 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.772404 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg5rd\" (UniqueName: \"kubernetes.io/projected/0a98bca9-38d3-4382-a6d6-8410170f7d81-kube-api-access-pg5rd\") pod \"openshift-controller-manager-operator-756b6f6bc6-sh4ps\" (UID: \"0a98bca9-38d3-4382-a6d6-8410170f7d81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.795368 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwn7z\" (UniqueName: \"kubernetes.io/projected/79ec9746-96c0-4fcd-b367-a42b6950145a-kube-api-access-wwn7z\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.804328 4816 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.812896 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spn2v\" (UniqueName: \"kubernetes.io/projected/dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a-kube-api-access-spn2v\") pod \"control-plane-machine-set-operator-78cbb6b69f-mwhpz\" (UID: \"dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.830527 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.841577 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdffs\" (UniqueName: \"kubernetes.io/projected/9397185e-a9e3-4ef4-b0be-d9dc9208adff-kube-api-access-pdffs\") pod \"collect-profiles-29560320-4hk5d\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.862950 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.865503 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.868277 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkph5\" (UniqueName: \"kubernetes.io/projected/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-kube-api-access-pkph5\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.872647 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.878353 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j62nk\" (UniqueName: \"kubernetes.io/projected/0386f821-c5fb-4dfd-acaf-706e214a57c0-kube-api-access-j62nk\") pod \"migrator-59844c95c7-fwkzt\" (UID: \"0386f821-c5fb-4dfd-acaf-706e214a57c0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.886084 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.910354 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.913969 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rclzn\" (UniqueName: \"kubernetes.io/projected/a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea-kube-api-access-rclzn\") pod \"olm-operator-6b444d44fb-nsxl4\" (UID: \"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.915003 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz"] Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.932740 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgzct\" (UniqueName: \"kubernetes.io/projected/9e737c04-a2db-452e-adc7-fa383e158b53-kube-api-access-wgzct\") pod \"package-server-manager-789f6589d5-jm8db\" (UID: \"9e737c04-a2db-452e-adc7-fa383e158b53\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.940040 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.946781 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgh99\" (UniqueName: \"kubernetes.io/projected/02854230-6165-4f22-8780-d8591b991132-kube-api-access-zgh99\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.986972 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nnqsw"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.027909 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.027991 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028022 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: 
\"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028045 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028071 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r7jk\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-kube-api-access-9r7jk\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028096 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028135 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 
00:10:35.028163 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b155133b-d494-44bc-aa5d-23efc7cbd7a6-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028187 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b155133b-d494-44bc-aa5d-23efc7cbd7a6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028213 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-certificates\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028238 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028262 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-tls\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028285 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028321 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-policies\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028346 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028372 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxdqt\" (UniqueName: \"kubernetes.io/projected/7c3e347f-464a-43f1-bf29-689bf81a28e6-kube-api-access-kxdqt\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028396 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-trusted-ca\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028419 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028446 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-dir\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028469 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028496 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-bound-sa-token\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028531 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.028898 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:35.528882574 +0000 UTC m=+228.625182527 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.061805 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.083954 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.109370 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.111926 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.120607 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6r96v"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.121801 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.129303 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.129533 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:35.629444919 +0000 UTC m=+228.725744872 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.129755 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.129849 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-socket-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.131901 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.132037 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.132103 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.132237 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r7jk\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-kube-api-access-9r7jk\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.132280 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.132303 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847-config-volume\") pod \"dns-default-npvts\" (UID: \"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847\") " pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc 
kubenswrapper[4816]: I0316 00:10:35.132319 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847-metrics-tls\") pod \"dns-default-npvts\" (UID: \"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847\") " pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.132386 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgzpq\" (UniqueName: \"kubernetes.io/projected/fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847-kube-api-access-kgzpq\") pod \"dns-default-npvts\" (UID: \"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847\") " pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.133540 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.133665 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.134625 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.135540 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b155133b-d494-44bc-aa5d-23efc7cbd7a6-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.135706 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b155133b-d494-44bc-aa5d-23efc7cbd7a6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.135829 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-certificates\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.135987 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.136026 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjls2\" (UniqueName: 
\"kubernetes.io/projected/63006c82-767f-4514-9d7e-5afd9bfe6e96-kube-api-access-zjls2\") pod \"machine-config-server-vkr88\" (UID: \"63006c82-767f-4514-9d7e-5afd9bfe6e96\") " pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.136199 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.136311 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-tls\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.136427 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4gx5\" (UniqueName: \"kubernetes.io/projected/7759829d-d50c-4dd7-8627-040ebf8f0e40-kube-api-access-x4gx5\") pod \"ingress-canary-2k6jt\" (UID: \"7759829d-d50c-4dd7-8627-040ebf8f0e40\") " pod="openshift-ingress-canary/ingress-canary-2k6jt" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.136787 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-certificates\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.137609 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-registration-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.137712 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-policies\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.137746 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.137818 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxdqt\" (UniqueName: \"kubernetes.io/projected/7c3e347f-464a-43f1-bf29-689bf81a28e6-kube-api-access-kxdqt\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.137848 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7759829d-d50c-4dd7-8627-040ebf8f0e40-cert\") pod \"ingress-canary-2k6jt\" (UID: \"7759829d-d50c-4dd7-8627-040ebf8f0e40\") " 
pod="openshift-ingress-canary/ingress-canary-2k6jt" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.137920 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/63006c82-767f-4514-9d7e-5afd9bfe6e96-certs\") pod \"machine-config-server-vkr88\" (UID: \"63006c82-767f-4514-9d7e-5afd9bfe6e96\") " pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.137972 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-trusted-ca\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.138000 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/63006c82-767f-4514-9d7e-5afd9bfe6e96-node-bootstrap-token\") pod \"machine-config-server-vkr88\" (UID: \"63006c82-767f-4514-9d7e-5afd9bfe6e96\") " pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.138234 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.138620 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.138631 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b155133b-d494-44bc-aa5d-23efc7cbd7a6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.138765 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.138792 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-csi-data-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.139357 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-trusted-ca\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.139532 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-dir\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.139855 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-dir\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.139982 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.140127 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-bound-sa-token\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.140325 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.140416 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-mountpoint-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.140680 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5cmq\" (UniqueName: \"kubernetes.io/projected/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-kube-api-access-z5cmq\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.140744 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:35.640731388 +0000 UTC m=+228.737031341 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.140778 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-plugins-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.142078 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.143682 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-policies\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.144775 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b155133b-d494-44bc-aa5d-23efc7cbd7a6-installation-pull-secrets\") pod 
\"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.144810 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.145096 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.145784 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.145901 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-tls\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.147048 4816 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.147286 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.151370 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.151836 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.157958 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.159105 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r7jk\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-kube-api-access-9r7jk\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.178032 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.205509 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxdqt\" (UniqueName: \"kubernetes.io/projected/7c3e347f-464a-43f1-bf29-689bf81a28e6-kube-api-access-kxdqt\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.226290 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.227725 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-bound-sa-token\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242158 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242327 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/63006c82-767f-4514-9d7e-5afd9bfe6e96-certs\") pod \"machine-config-server-vkr88\" (UID: \"63006c82-767f-4514-9d7e-5afd9bfe6e96\") " pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242356 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/63006c82-767f-4514-9d7e-5afd9bfe6e96-node-bootstrap-token\") pod \"machine-config-server-vkr88\" (UID: \"63006c82-767f-4514-9d7e-5afd9bfe6e96\") " pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242379 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-csi-data-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: 
\"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242415 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-mountpoint-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242448 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5cmq\" (UniqueName: \"kubernetes.io/projected/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-kube-api-access-z5cmq\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242476 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-plugins-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242510 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-socket-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242581 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847-config-volume\") pod \"dns-default-npvts\" (UID: \"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847\") " 
pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242600 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847-metrics-tls\") pod \"dns-default-npvts\" (UID: \"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847\") " pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242641 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgzpq\" (UniqueName: \"kubernetes.io/projected/fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847-kube-api-access-kgzpq\") pod \"dns-default-npvts\" (UID: \"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847\") " pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242683 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjls2\" (UniqueName: \"kubernetes.io/projected/63006c82-767f-4514-9d7e-5afd9bfe6e96-kube-api-access-zjls2\") pod \"machine-config-server-vkr88\" (UID: \"63006c82-767f-4514-9d7e-5afd9bfe6e96\") " pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242718 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4gx5\" (UniqueName: \"kubernetes.io/projected/7759829d-d50c-4dd7-8627-040ebf8f0e40-kube-api-access-x4gx5\") pod \"ingress-canary-2k6jt\" (UID: \"7759829d-d50c-4dd7-8627-040ebf8f0e40\") " pod="openshift-ingress-canary/ingress-canary-2k6jt" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242741 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-registration-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " 
pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242769 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7759829d-d50c-4dd7-8627-040ebf8f0e40-cert\") pod \"ingress-canary-2k6jt\" (UID: \"7759829d-d50c-4dd7-8627-040ebf8f0e40\") " pod="openshift-ingress-canary/ingress-canary-2k6jt" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.243233 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-plugins-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.243348 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:35.743323139 +0000 UTC m=+228.839623092 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.245220 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-socket-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.246086 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847-config-volume\") pod \"dns-default-npvts\" (UID: \"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847\") " pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.247861 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-mountpoint-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.247951 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-csi-data-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 
00:10:35.248050 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-registration-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.261199 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/63006c82-767f-4514-9d7e-5afd9bfe6e96-node-bootstrap-token\") pod \"machine-config-server-vkr88\" (UID: \"63006c82-767f-4514-9d7e-5afd9bfe6e96\") " pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.265936 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mplx7"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.265978 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.268782 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/63006c82-767f-4514-9d7e-5afd9bfe6e96-certs\") pod \"machine-config-server-vkr88\" (UID: \"63006c82-767f-4514-9d7e-5afd9bfe6e96\") " pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.273557 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7759829d-d50c-4dd7-8627-040ebf8f0e40-cert\") pod \"ingress-canary-2k6jt\" (UID: \"7759829d-d50c-4dd7-8627-040ebf8f0e40\") " pod="openshift-ingress-canary/ingress-canary-2k6jt" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.275882 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847-metrics-tls\") pod \"dns-default-npvts\" (UID: \"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847\") " pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.297020 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgzpq\" (UniqueName: \"kubernetes.io/projected/fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847-kube-api-access-kgzpq\") pod \"dns-default-npvts\" (UID: \"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847\") " pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.315728 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjls2\" (UniqueName: \"kubernetes.io/projected/63006c82-767f-4514-9d7e-5afd9bfe6e96-kube-api-access-zjls2\") pod \"machine-config-server-vkr88\" (UID: \"63006c82-767f-4514-9d7e-5afd9bfe6e96\") " pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.320100 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.344225 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4gx5\" (UniqueName: \"kubernetes.io/projected/7759829d-d50c-4dd7-8627-040ebf8f0e40-kube-api-access-x4gx5\") pod \"ingress-canary-2k6jt\" (UID: \"7759829d-d50c-4dd7-8627-040ebf8f0e40\") " pod="openshift-ingress-canary/ingress-canary-2k6jt" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.358078 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.358588 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:35.858575805 +0000 UTC m=+228.954875758 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.365397 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5cmq\" (UniqueName: \"kubernetes.io/projected/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-kube-api-access-z5cmq\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: W0316 00:10:35.435990 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod069c0b04_3302_488c_84fc_eeccac5fae9b.slice/crio-87bda69ca4cbf86419070ae0f95c798316c9febbcad86ed3ec9d714696e0e739 WatchSource:0}: Error finding container 87bda69ca4cbf86419070ae0f95c798316c9febbcad86ed3ec9d714696e0e739: Status 404 returned error can't find the container with id 87bda69ca4cbf86419070ae0f95c798316c9febbcad86ed3ec9d714696e0e739 Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.458722 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.458854 4816 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:35.958832663 +0000 UTC m=+229.055132616 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.459516 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.459826 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:35.959817439 +0000 UTC m=+229.056117392 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.533119 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560330-44pts"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.554682 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2k6jt" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.561623 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.562144 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.562386 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.062367109 +0000 UTC m=+229.158667062 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.562455 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.562767 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.06276079 +0000 UTC m=+229.159060743 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.580983 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.587773 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.668304 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.668712 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.168694463 +0000 UTC m=+229.264994416 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.693985 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29560320-s9q72"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.702306 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.734442 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tv2n7"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.771088 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.771517 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.271502659 +0000 UTC m=+229.367802602 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.793852 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" event={"ID":"c1674a73-a65c-4a8d-9dc5-af576a7af7d4","Type":"ContainerStarted","Data":"7c6c500cfb28dc83d59d87853219a73cb22b2472353d3dcec62ec2379a608552"} Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.800567 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" event={"ID":"069c0b04-3302-488c-84fc-eeccac5fae9b","Type":"ContainerStarted","Data":"87bda69ca4cbf86419070ae0f95c798316c9febbcad86ed3ec9d714696e0e739"} Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.813162 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" event={"ID":"1d5466ab-a589-4f7e-ae89-2f494b10f6b1","Type":"ContainerStarted","Data":"8bd92ab2e8746013ff96fbb3362f4a912a98fe884156f1b95b8704505ab4fe1a"} Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.816484 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.818010 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" 
event={"ID":"93dadffe-0353-4301-bb97-31b034d3dc64","Type":"ContainerStarted","Data":"974b3e2b70f0e27a9b647bc8e391de77e5ca6e17e53d98fb67c7dcd79bccb304"}
Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.826139 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nnqsw" event={"ID":"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c","Type":"ContainerStarted","Data":"a0d10d1111e7e2e77d52b5e27988eadcbe0b5acc6c5de4dfdc0a8578af7ebd49"}
Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.827624 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" event={"ID":"044562bd-df74-47fa-bc8d-1c652233e9c5","Type":"ContainerStarted","Data":"8603120fd31cddbb1af5cb78a46694e27d3663732e21a55abf708208a435fb18"}
Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.834246 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" event={"ID":"005900fa-b395-4c1c-8e62-8e975bd0393c","Type":"ContainerStarted","Data":"0529b06f25e6c243657457dfefc2530ec2a164ac5044d8a511ecbf34234991c8"}
Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.848708 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5rr7c"]
Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.850296 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" event={"ID":"1306b657-0022-435d-bb72-793f1c1a106b","Type":"ContainerStarted","Data":"064af4bec576ab72311cb9c881471458180983ba55703837df4cc30019c61214"}
Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.852387 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" event={"ID":"34f93b2b-cc36-4965-992c-825bf2595e1e","Type":"ContainerStarted","Data":"e4420a97b4bb6be4d17b075ce1b2cb37e8413a0733d1148eec85de090c917bf4"}
Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.853462 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" event={"ID":"7442ef1b-27ea-4166-8457-5332c4c8f363","Type":"ContainerStarted","Data":"73c6daa7fcb97ac560caea2fc845a1867ae5d14edd93f8da77d854675d386c28"}
Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.864617 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560330-44pts" event={"ID":"55e76e8f-7d69-4f55-81f8-45c9c612876b","Type":"ContainerStarted","Data":"14379482594ebf801c25583d0aab03c78f3555265f22f25f8cbeb498177ecef2"}
Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.866830 4816 generic.go:334] "Generic (PLEG): container finished" podID="dced2102-9fd0-4300-9e0a-35d915f1caad" containerID="fe139286db47fd751017f6a262d773b9fb23cd2d2ec4d8ec9a6e456cf554930a" exitCode=0
Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.866974 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" event={"ID":"dced2102-9fd0-4300-9e0a-35d915f1caad","Type":"ContainerDied","Data":"fe139286db47fd751017f6a262d773b9fb23cd2d2ec4d8ec9a6e456cf554930a"}
Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.868639 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4" event={"ID":"b98dcc1e-4c4b-47eb-9ddf-59a138f94247","Type":"ContainerStarted","Data":"eeb77bec78a64a33e16b0e2cefa3b8ccabc03a84fa07e1289532500e2736e77c"}
Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.873053 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" event={"ID":"5e41d768-3ed4-4760-a0d5-4308d7b13379","Type":"ContainerStarted","Data":"5da8a9034d9756144255cb527badb2e6f547c6ea514e0a549acc6e784382f799"}
Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.873509 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.873846 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.373832323 +0000 UTC m=+229.470132276 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.884831 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-q9xc9"]
Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.976304 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.977411 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.477396071 +0000 UTC m=+229.573696024 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:36 crc kubenswrapper[4816]: W0316 00:10:36.037576 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f90d894_17c6_4800_a438_737fe8619e01.slice/crio-1b400492d5ff6227c838ac428fcdaf391667d35c7a51faf19494d2104b095289 WatchSource:0}: Error finding container 1b400492d5ff6227c838ac428fcdaf391667d35c7a51faf19494d2104b095289: Status 404 returned error can't find the container with id 1b400492d5ff6227c838ac428fcdaf391667d35c7a51faf19494d2104b095289
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.077276 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.077771 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.577751941 +0000 UTC m=+229.674051894 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.179334 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.179641 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.679625342 +0000 UTC m=+229.775925295 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.283693 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.284327 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.784313071 +0000 UTC m=+229.880613024 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.290890 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm"]
Mar 16 00:10:36 crc kubenswrapper[4816]: W0316 00:10:36.317026 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4bdfe91_48ef_4a44_99ba_d7ab90df9ec0.slice/crio-e7512c0843d9774277717bee67fedbfc19a4fbf662ac7ce19fc4a721573a25b6 WatchSource:0}: Error finding container e7512c0843d9774277717bee67fedbfc19a4fbf662ac7ce19fc4a721573a25b6: Status 404 returned error can't find the container with id e7512c0843d9774277717bee67fedbfc19a4fbf662ac7ce19fc4a721573a25b6
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.354837 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7"]
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.358926 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps"]
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.372862 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-l648b"]
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.375371 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf"]
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.385404 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.385776 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.88576155 +0000 UTC m=+229.982061503 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.386288 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2"]
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.486796 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.486976 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.986950343 +0000 UTC m=+230.083250296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.487523 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.487886 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.987870248 +0000 UTC m=+230.084170201 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:36 crc kubenswrapper[4816]: W0316 00:10:36.515181 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod064b42ee_720b_456c_8ffe_a247f827befc.slice/crio-435f4c94388020c089f5ba1a78632c86095286e9141f52726e93fc0a6363522c WatchSource:0}: Error finding container 435f4c94388020c089f5ba1a78632c86095286e9141f52726e93fc0a6363522c: Status 404 returned error can't find the container with id 435f4c94388020c089f5ba1a78632c86095286e9141f52726e93fc0a6363522c
Mar 16 00:10:36 crc kubenswrapper[4816]: W0316 00:10:36.516178 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29bcfe72_6ef1_4087_9feb_787fdba3d2d7.slice/crio-3c7d92de283b16898a891bb4b37b3cd54bf360ea0cbf78310da58d80dac3aa50 WatchSource:0}: Error finding container 3c7d92de283b16898a891bb4b37b3cd54bf360ea0cbf78310da58d80dac3aa50: Status 404 returned error can't find the container with id 3c7d92de283b16898a891bb4b37b3cd54bf360ea0cbf78310da58d80dac3aa50
Mar 16 00:10:36 crc kubenswrapper[4816]: W0316 00:10:36.537035 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79ec9746_96c0_4fcd_b367_a42b6950145a.slice/crio-d3c8b5ab4cfd90d2004818a74572fcd44139f28307a477559adc275564959dd6 WatchSource:0}: Error finding container d3c8b5ab4cfd90d2004818a74572fcd44139f28307a477559adc275564959dd6: Status 404 returned error can't find the container with id d3c8b5ab4cfd90d2004818a74572fcd44139f28307a477559adc275564959dd6
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.588975 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.589051 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.08902991 +0000 UTC m=+230.185329863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.590679 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.592302 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.092268929 +0000 UTC m=+230.188568882 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.695540 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.698233 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.198204661 +0000 UTC m=+230.294504614 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.698323 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d"]
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.701943 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.703097 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.203080994 +0000 UTC m=+230.299380947 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.733644 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t"]
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.740895 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8"]
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.747315 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6nkm6"]
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.750697 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz"]
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.756105 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4"]
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.804265 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.804583 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.304534154 +0000 UTC m=+230.400834107 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.828476 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db"]
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.834288 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" podStartSLOduration=178.834265196 podStartE2EDuration="2m58.834265196s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:36.831738507 +0000 UTC m=+229.928038460" watchObservedRunningTime="2026-03-16 00:10:36.834265196 +0000 UTC m=+229.930565149"
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.890345 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" event={"ID":"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0","Type":"ContainerStarted","Data":"e7512c0843d9774277717bee67fedbfc19a4fbf662ac7ce19fc4a721573a25b6"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.895996 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" event={"ID":"34f93b2b-cc36-4965-992c-825bf2595e1e","Type":"ContainerStarted","Data":"0102ebae46c1fc8e99ff0cb4e50c57fe46507b471478556f3430169c1de5acb0"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.897801 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" event={"ID":"5e41d768-3ed4-4760-a0d5-4308d7b13379","Type":"ContainerStarted","Data":"ccf54344c9cf242f18daa95790d42c562fbe2956ec54e050d405db668a935bf5"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.902862 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" event={"ID":"c1674a73-a65c-4a8d-9dc5-af576a7af7d4","Type":"ContainerStarted","Data":"2034ace62ae33c1ff5aa46586892fcaa5167fe1712a3333f7b2e270628a77021"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.906099 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sshl5"]
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.906194 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.906525 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.406511498 +0000 UTC m=+230.502811461 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.908947 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" event={"ID":"171b00f7-f7cf-41b3-bffd-11ceeb9f2182","Type":"ContainerStarted","Data":"74ca3ab68b5db7b7155d0d401ca028d9f990f240b0723da3ddb90c76801efeca"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.910066 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nnqsw" event={"ID":"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c","Type":"ContainerStarted","Data":"b3a7ebd80e41e65628b937f27a9a6d0026e0665dfdb1a3dff17f4765a374b5cb"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.914839 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" event={"ID":"9397185e-a9e3-4ef4-b0be-d9dc9208adff","Type":"ContainerStarted","Data":"b19b5574ead1cf818c519a7ffb8ef773b81e380296fd94d88cb6d44a3be77066"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.916576 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" event={"ID":"ef3a7303-57a8-461f-86c1-fd3f7882e93b","Type":"ContainerStarted","Data":"3d4fc8466c3535a388e50e076cc43357b3c1637c0fa69064b15bbb7d8d979495"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.922204 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" event={"ID":"93dadffe-0353-4301-bb97-31b034d3dc64","Type":"ContainerStarted","Data":"165ac52589fbf987de83ac85b0f5daf2f38695714d76c0365a37f99757d92693"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.925918 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" event={"ID":"59c840a8-f288-44ed-83d3-34d47041c6c6","Type":"ContainerStarted","Data":"c46a7076608f889c8e30b77b33715ed49c92e64799e40fe88b9f99f6e980f6a5"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.925966 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" event={"ID":"59c840a8-f288-44ed-83d3-34d47041c6c6","Type":"ContainerStarted","Data":"360f090f6a27a9d9ebb782602e54104c845a3d5e91127b115ef7d468e384ebfe"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.927043 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7"
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.927935 4816 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tv2n7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body=
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.928175 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" podUID="59c840a8-f288-44ed-83d3-34d47041c6c6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused"
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.928217 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" event={"ID":"0a98bca9-38d3-4382-a6d6-8410170f7d81","Type":"ContainerStarted","Data":"459a22c156ab861c35576c1fb3628cbc341c3b5347eb2308c3393d558f2a1ab7"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.936290 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" event={"ID":"1306b657-0022-435d-bb72-793f1c1a106b","Type":"ContainerStarted","Data":"b3b4ee5b25b557320498976e779daeaaad18e40f9b7a18ae9011a6748fba579b"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.944437 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" event={"ID":"044562bd-df74-47fa-bc8d-1c652233e9c5","Type":"ContainerStarted","Data":"feb3e115aaa958caf7f3f1e53cfdcc9b39f697220b3b5b82172c36b393279bde"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.945787 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2k6jt"]
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.947717 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gvk75" event={"ID":"681ca8e4-f909-4e8b-9f35-5ab8ca382e44","Type":"ContainerStarted","Data":"6af71f7281740ba22cc030b932f6be0867123264435c44d5f5ba540286f918d6"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.963862 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" event={"ID":"069c0b04-3302-488c-84fc-eeccac5fae9b","Type":"ContainerStarted","Data":"9f73049d4c6407e7f4bf7d9a09acf185cda5e22663da389d70890266ca82af24"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.972603 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" event={"ID":"79ec9746-96c0-4fcd-b367-a42b6950145a","Type":"ContainerStarted","Data":"d3c8b5ab4cfd90d2004818a74572fcd44139f28307a477559adc275564959dd6"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.980880 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" event={"ID":"064b42ee-720b-456c-8ffe-a247f827befc","Type":"ContainerStarted","Data":"435f4c94388020c089f5ba1a78632c86095286e9141f52726e93fc0a6363522c"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.981808 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5rr7c" event={"ID":"0ec3cdc0-f024-43cf-b520-7d2437e0f8df","Type":"ContainerStarted","Data":"38ef432004d9ccef99538782b144f5189443b51f7377a192547af787185ab274"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.982469 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" event={"ID":"4f90d894-17c6-4800-a438-737fe8619e01","Type":"ContainerStarted","Data":"1b400492d5ff6227c838ac428fcdaf391667d35c7a51faf19494d2104b095289"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.983878 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8226q"]
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.986239 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vkr88" event={"ID":"63006c82-767f-4514-9d7e-5afd9bfe6e96","Type":"ContainerStarted","Data":"defc7bad5c4bbc8a32133730e52ae3ffb20ab00ebd6954c4c5771830720b2d0c"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.986280 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vkr88" event={"ID":"63006c82-767f-4514-9d7e-5afd9bfe6e96","Type":"ContainerStarted","Data":"77a12cf7d9180afb497655c7adf9f2eba68426ea31f62166c80397d342b69cff"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.990727 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29560320-s9q72" event={"ID":"9fc59286-0388-4519-afc7-f2c8cf80ab40","Type":"ContainerStarted","Data":"a30805e487fac9e751dab1510445d1b512d8b7784f8e73df1f67f72887178e24"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.990768 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29560320-s9q72" event={"ID":"9fc59286-0388-4519-afc7-f2c8cf80ab40","Type":"ContainerStarted","Data":"469ef439f1bc4e49165115c6fecd0f6feec675c1f680294bca4301ee3520daee"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.993952 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt"]
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.994625 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" event={"ID":"005900fa-b395-4c1c-8e62-8e975bd0393c","Type":"ContainerStarted","Data":"aa45c97bed9092558ce91ff7b080b67a6b2b1b3a899958cb8b70ee721ff99937"}
Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.996279 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-q9xc9" event={"ID":"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e","Type":"ContainerStarted","Data":"28227b88f3697e660f1b439e8a912712575523aba2448c2ce551decede39b872"}
Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.001898 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" event={"ID":"29bcfe72-6ef1-4087-9feb-787fdba3d2d7","Type":"ContainerStarted","Data":"3c7d92de283b16898a891bb4b37b3cd54bf360ea0cbf78310da58d80dac3aa50"}
Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.003048 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" event={"ID":"1d5466ab-a589-4f7e-ae89-2f494b10f6b1","Type":"ContainerStarted","Data":"e90fdfac87f05e45b64d63ce5cb4d5902fbd18d9c1d580577069351527db0c29"}
Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.003400 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"
Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.005467 4816 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-d9j8j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.005492 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" podUID="1d5466ab-a589-4f7e-ae89-2f494b10f6b1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused"
Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.005532 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" podStartSLOduration=179.005522821 podStartE2EDuration="2m59.005522821s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 
00:10:37.003071454 +0000 UTC m=+230.099371407" watchObservedRunningTime="2026-03-16 00:10:37.005522821 +0000 UTC m=+230.101822774" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.006345 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4" event={"ID":"b98dcc1e-4c4b-47eb-9ddf-59a138f94247","Type":"ContainerStarted","Data":"3649dcfc95a4cd8b92968a789c1a118f74cc3c2308abfaa832728996a26a704c"} Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.006702 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.007170 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.507114504 +0000 UTC m=+230.603414527 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.008605 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.012719 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.512703297 +0000 UTC m=+230.609003360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.019426 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-npvts"] Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.046578 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" podStartSLOduration=179.046543651 podStartE2EDuration="2m59.046543651s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.045537853 +0000 UTC m=+230.141837806" watchObservedRunningTime="2026-03-16 00:10:37.046543651 +0000 UTC m=+230.142843604" Mar 16 00:10:37 crc kubenswrapper[4816]: W0316 00:10:37.070066 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1bc4b9a_e741_4f63_81e8_fdce3da0a5ea.slice/crio-2a2d73448d9f3e9c877e90218a56aa472e79752e5da512ecfe9f6dffb3aad02f WatchSource:0}: Error finding container 2a2d73448d9f3e9c877e90218a56aa472e79752e5da512ecfe9f6dffb3aad02f: Status 404 returned error can't find the container with id 2a2d73448d9f3e9c877e90218a56aa472e79752e5da512ecfe9f6dffb3aad02f Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.108130 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-trp9l"] Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.108687 4816 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" podStartSLOduration=179.108669637 podStartE2EDuration="2m59.108669637s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.097665317 +0000 UTC m=+230.193965270" watchObservedRunningTime="2026-03-16 00:10:37.108669637 +0000 UTC m=+230.204969590" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.110101 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.110170 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.610154618 +0000 UTC m=+230.706454571 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.117188 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.118382 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.618363132 +0000 UTC m=+230.714663085 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.147067 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" podStartSLOduration=180.147045305 podStartE2EDuration="3m0.147045305s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.122259218 +0000 UTC m=+230.218559171" watchObservedRunningTime="2026-03-16 00:10:37.147045305 +0000 UTC m=+230.243345258" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.152987 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" podStartSLOduration=179.152970267 podStartE2EDuration="2m59.152970267s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.152530045 +0000 UTC m=+230.248829998" watchObservedRunningTime="2026-03-16 00:10:37.152970267 +0000 UTC m=+230.249270220" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.202312 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" podStartSLOduration=180.202291603 podStartE2EDuration="3m0.202291603s" podCreationTimestamp="2026-03-16 00:07:37 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.189851194 +0000 UTC m=+230.286151157" watchObservedRunningTime="2026-03-16 00:10:37.202291603 +0000 UTC m=+230.298591556" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.218915 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.219242 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.719228806 +0000 UTC m=+230.815528759 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.240273 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-gvk75" podStartSLOduration=179.24025488 podStartE2EDuration="2m59.24025488s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.238031559 +0000 UTC m=+230.334331502" watchObservedRunningTime="2026-03-16 00:10:37.24025488 +0000 UTC m=+230.336554833" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.309383 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-nnqsw" podStartSLOduration=179.309364457 podStartE2EDuration="2m59.309364457s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.274192946 +0000 UTC m=+230.370492909" watchObservedRunningTime="2026-03-16 00:10:37.309364457 +0000 UTC m=+230.405664410" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.310726 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" podStartSLOduration=179.310716424 podStartE2EDuration="2m59.310716424s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.309067739 +0000 UTC m=+230.405367712" watchObservedRunningTime="2026-03-16 00:10:37.310716424 +0000 UTC m=+230.407016377" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.320666 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.321007 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.820992744 +0000 UTC m=+230.917292697 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.345111 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-vkr88" podStartSLOduration=5.345094262 podStartE2EDuration="5.345094262s" podCreationTimestamp="2026-03-16 00:10:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.344569428 +0000 UTC m=+230.440869381" watchObservedRunningTime="2026-03-16 00:10:37.345094262 +0000 UTC m=+230.441394215" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.387701 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" podStartSLOduration=180.387686165 podStartE2EDuration="3m0.387686165s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.386138453 +0000 UTC m=+230.482438436" watchObservedRunningTime="2026-03-16 00:10:37.387686165 +0000 UTC m=+230.483986138" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.421977 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.422284 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.922270279 +0000 UTC m=+231.018570232 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.430495 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" podStartSLOduration=179.430477053 podStartE2EDuration="2m59.430477053s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.42960567 +0000 UTC m=+230.525905633" watchObservedRunningTime="2026-03-16 00:10:37.430477053 +0000 UTC m=+230.526777006" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.478091 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29560320-s9q72" podStartSLOduration=180.478072313 podStartE2EDuration="3m0.478072313s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-16 00:10:37.476169551 +0000 UTC m=+230.572469504" watchObservedRunningTime="2026-03-16 00:10:37.478072313 +0000 UTC m=+230.574372266" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.524444 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.524819 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.024799669 +0000 UTC m=+231.121099682 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.553025 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.553575 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 16 00:10:37 crc kubenswrapper[4816]: 
I0316 00:10:37.553604 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.628180 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.628353 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.128327485 +0000 UTC m=+231.224627438 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.628498 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.628945 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.128936192 +0000 UTC m=+231.225236145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.729144 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.729369 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.229342943 +0000 UTC m=+231.325642906 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.729818 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.730135 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.230122695 +0000 UTC m=+231.326422648 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.830569 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.830944 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.330924257 +0000 UTC m=+231.427224210 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.931620 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.932105 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.432090609 +0000 UTC m=+231.528390562 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.013497 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" event={"ID":"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd","Type":"ContainerStarted","Data":"26f97e4c130f331c2d50ce2b937d50024b4efd11592e9eb2859a5e9894578260"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.013597 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" event={"ID":"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd","Type":"ContainerStarted","Data":"a62d9dc9b5386cd41a4ae67f3beb9c6b4df9fc260cfe573952486ec5c5bc7a2e"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.017055 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" event={"ID":"7442ef1b-27ea-4166-8457-5332c4c8f363","Type":"ContainerStarted","Data":"4c1bf1d323eb433075e918aa55b3dfbdbcb746a1bc363fcfca0b609ae1dff650"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.027443 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-trp9l" event={"ID":"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2","Type":"ContainerStarted","Data":"dd4d838880c6bd7c297db64a0c1cea2342563e729a46d32282ed954ca2fecaad"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.032393 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.032769 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.532751097 +0000 UTC m=+231.629051050 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.034177 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" event={"ID":"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea","Type":"ContainerStarted","Data":"166c68c117b0d047b57a181137d2c0acfbfb9d239261c31c810f6e12108921d9"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.034216 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" event={"ID":"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea","Type":"ContainerStarted","Data":"2a2d73448d9f3e9c877e90218a56aa472e79752e5da512ecfe9f6dffb3aad02f"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.038214 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" 
event={"ID":"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0","Type":"ContainerStarted","Data":"aa168efabbf9d392d454d37cbc266812b2313552f614751fa78eb0e89f861f35"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.040011 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" event={"ID":"4f90d894-17c6-4800-a438-737fe8619e01","Type":"ContainerStarted","Data":"d9558b98ac0b8301b1e2fd81ab83d4eaebf891ae7f77f266a39b5bc52e74f754"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.047693 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" event={"ID":"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5","Type":"ContainerStarted","Data":"eedfb299c62b00dbca7e4f4925bca71acaf7c649798cb665bcff51f84f3f22cb"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.053242 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" event={"ID":"dced2102-9fd0-4300-9e0a-35d915f1caad","Type":"ContainerStarted","Data":"8a26f5496e8e0c593d07dd4aa2ec276b40aeb18ed7abdb0a1d52fdca818b2d58"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.059761 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" event={"ID":"29bcfe72-6ef1-4087-9feb-787fdba3d2d7","Type":"ContainerStarted","Data":"04c28cf56e5a344151bb2ad50a201b5cc3a6cf82f7e0bf250c9d9e5100b6245e"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.060065 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.062822 4816 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-gtwmf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": 
dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.062881 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" podUID="29bcfe72-6ef1-4087-9feb-787fdba3d2d7" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.081202 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4" event={"ID":"b98dcc1e-4c4b-47eb-9ddf-59a138f94247","Type":"ContainerStarted","Data":"02e89da47525606b5bdfb001ed5ff1164102abf4c6983f572471ec01e458a74c"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.085294 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" event={"ID":"9e737c04-a2db-452e-adc7-fa383e158b53","Type":"ContainerStarted","Data":"b073d0d4c2c60bff6d7b6f5d036b426a16e798c5b984e87897f11efb64f9289a"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.085451 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" event={"ID":"9e737c04-a2db-452e-adc7-fa383e158b53","Type":"ContainerStarted","Data":"92575e8e31aebcdde122ac5fae631766325c1f58e9a7f02918e7f2265ec057c1"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.087600 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" event={"ID":"064b42ee-720b-456c-8ffe-a247f827befc","Type":"ContainerStarted","Data":"73fbecdf6a7fd7c801af9284efb8b05891d8738c1930c8fb1def974987b2f95f"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.087861 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.089073 4816 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zn6w7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.089118 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" podUID="064b42ee-720b-456c-8ffe-a247f827befc" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.089576 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" event={"ID":"79ec9746-96c0-4fcd-b367-a42b6950145a","Type":"ContainerStarted","Data":"40a419244d9710129281429e56957e263d3d8536b057f748b088c553d5e63bc0"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.090997 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-q9xc9" event={"ID":"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e","Type":"ContainerStarted","Data":"1112e628b675b9cd7bfcdc017c14b275f18d39ad1cbe487a0c459256c84f03de"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.091439 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.092481 4816 patch_prober.go:28] interesting pod/console-operator-58897d9998-q9xc9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": 
dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.092519 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-q9xc9" podUID="4cc341f9-55c7-4bce-a0e3-24df68ca7f0e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.093152 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" event={"ID":"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9","Type":"ContainerStarted","Data":"0cb716ff167593f0ce0b7cf0c62e4a01bf26679766bfd30f596995ff18105521"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.093183 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" event={"ID":"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9","Type":"ContainerStarted","Data":"99594b5194aa148721f790813af4c45ce7463df5d5d273f411a500c85e93558e"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.095849 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2k6jt" event={"ID":"7759829d-d50c-4dd7-8627-040ebf8f0e40","Type":"ContainerStarted","Data":"ed6369d343eaa6ced138f1e08f8920e50b7e75f80249eeed063194ae330c409c"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.096996 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" event={"ID":"9397185e-a9e3-4ef4-b0be-d9dc9208adff","Type":"ContainerStarted","Data":"a26a0a4314d400c57bc06c18480ab7a501ebc981f4b8dbd60334dd3390aec49c"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.098964 4816 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-dns/dns-default-npvts" event={"ID":"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847","Type":"ContainerStarted","Data":"26cce752542010b9b78a872e41deb471ebd65a5cf9e7b8dce46f0460be533d3d"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.100335 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" event={"ID":"ef3a7303-57a8-461f-86c1-fd3f7882e93b","Type":"ContainerStarted","Data":"d8a2b78be01e0a3cad62bf791050eb4476b6f7635f0071cf9f563caf64a00f35"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.106831 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" event={"ID":"dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a","Type":"ContainerStarted","Data":"4c5f5a9190c5fd7efd62406f64a4bc30706c3b659d00a057ad00f198d807f0fa"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.106885 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" event={"ID":"dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a","Type":"ContainerStarted","Data":"115b6339d9deaae8dfb122833be5a7a879426f2d399688991e46b804d72884b0"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.125859 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" event={"ID":"0a98bca9-38d3-4382-a6d6-8410170f7d81","Type":"ContainerStarted","Data":"2fb8f1515730058de3be4b92067511b1f91b5ec7b522a8c73925e59f777d1b2a"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.130497 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt" event={"ID":"0386f821-c5fb-4dfd-acaf-706e214a57c0","Type":"ContainerStarted","Data":"1f81d667b086a11aa202533810c522896c89e2ea6f7fb8a20b1c273c15855d63"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 
00:10:38.130630 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt" event={"ID":"0386f821-c5fb-4dfd-acaf-706e214a57c0","Type":"ContainerStarted","Data":"ff8e4bc0be4db6495a13d89a5084830a3005f00b79e93f30e8bb0988f12a09c0"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.134137 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.135169 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5rr7c" event={"ID":"0ec3cdc0-f024-43cf-b520-7d2437e0f8df","Type":"ContainerStarted","Data":"9e1758947a169fa8c89c8e3873ca56d930c8ca55c7c143100afca371ccc218fc"} Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.137020 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.636974583 +0000 UTC m=+231.733274646 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.140078 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-5rr7c" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.141501 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.141589 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.146381 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" event={"ID":"171b00f7-f7cf-41b3-bffd-11ceeb9f2182","Type":"ContainerStarted","Data":"024a1d4e9bfc2df97c747faa2472891f385dde870e0ae45dedd8fdf097d27c60"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.153832 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" 
event={"ID":"34f93b2b-cc36-4965-992c-825bf2595e1e","Type":"ContainerStarted","Data":"7b60a65838b86ad7eb2fb071b4ccae28841d696a876e23f060b33e63b90b13a3"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.160158 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" event={"ID":"3eb76779-61d8-4977-8839-083fcf6cd69b","Type":"ContainerStarted","Data":"a5ed6e71a594c27998c394e3bcbaac99853dd55430986ddf8a6c0c49260b7970"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.161984 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" event={"ID":"7c3e347f-464a-43f1-bf29-689bf81a28e6","Type":"ContainerStarted","Data":"3631ced358fcea8ef22224f7b1a8e3a7674d52e4a7296b38cf119840b4577b45"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.165781 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" event={"ID":"02854230-6165-4f22-8780-d8591b991132","Type":"ContainerStarted","Data":"6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.165821 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" event={"ID":"02854230-6165-4f22-8780-d8591b991132","Type":"ContainerStarted","Data":"fbc545a6e69e36c7e153d8947909848cfdb5be666c80ed949869b9fabb25d45a"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.166446 4816 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-d9j8j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.166515 4816 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" podUID="1d5466ab-a589-4f7e-ae89-2f494b10f6b1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.166617 4816 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tv2n7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.166649 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" podUID="59c840a8-f288-44ed-83d3-34d47041c6c6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.227329 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4" podStartSLOduration=181.227309509 podStartE2EDuration="3m1.227309509s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.225888621 +0000 UTC m=+231.322188574" watchObservedRunningTime="2026-03-16 00:10:38.227309509 +0000 UTC m=+231.323609462" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.235697 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.235882 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.735857093 +0000 UTC m=+231.832157046 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.238176 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.238419 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.738404182 +0000 UTC m=+231.834704215 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.270180 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" podStartSLOduration=180.270165539 podStartE2EDuration="3m0.270165539s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.268992987 +0000 UTC m=+231.365292940" watchObservedRunningTime="2026-03-16 00:10:38.270165539 +0000 UTC m=+231.366465492" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.308768 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" podStartSLOduration=180.308747843 podStartE2EDuration="3m0.308747843s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.308359612 +0000 UTC m=+231.404659565" watchObservedRunningTime="2026-03-16 00:10:38.308747843 +0000 UTC m=+231.405047796" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.340038 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.340489 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.840475539 +0000 UTC m=+231.936775492 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.350308 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" podStartSLOduration=181.350294877 podStartE2EDuration="3m1.350294877s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.349085364 +0000 UTC m=+231.445385307" watchObservedRunningTime="2026-03-16 00:10:38.350294877 +0000 UTC m=+231.446594830" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.386402 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" podStartSLOduration=180.386385173 podStartE2EDuration="3m0.386385173s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.385360995 +0000 UTC 
m=+231.481660948" watchObservedRunningTime="2026-03-16 00:10:38.386385173 +0000 UTC m=+231.482685126" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.441755 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.442336 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.9423239 +0000 UTC m=+232.038623843 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.469015 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" podStartSLOduration=180.468997198 podStartE2EDuration="3m0.468997198s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.46687999 +0000 UTC m=+231.563179943" watchObservedRunningTime="2026-03-16 00:10:38.468997198 +0000 UTC m=+231.565297151" Mar 16 00:10:38 crc 
kubenswrapper[4816]: I0316 00:10:38.511527 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-q9xc9" podStartSLOduration=180.511508949 podStartE2EDuration="3m0.511508949s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.511109568 +0000 UTC m=+231.607409521" watchObservedRunningTime="2026-03-16 00:10:38.511508949 +0000 UTC m=+231.607808902" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.545817 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.546291 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.046272548 +0000 UTC m=+232.142572501 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.558189 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.558623 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.599685 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" podStartSLOduration=180.599662366 podStartE2EDuration="3m0.599662366s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.591936155 +0000 UTC m=+231.688236108" watchObservedRunningTime="2026-03-16 00:10:38.599662366 +0000 UTC m=+231.695962319" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.599912 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-5rr7c" podStartSLOduration=180.599907742 
podStartE2EDuration="3m0.599907742s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.555296434 +0000 UTC m=+231.651596387" watchObservedRunningTime="2026-03-16 00:10:38.599907742 +0000 UTC m=+231.696207695" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.633652 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" podStartSLOduration=180.633631183 podStartE2EDuration="3m0.633631183s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.631073443 +0000 UTC m=+231.727373396" watchObservedRunningTime="2026-03-16 00:10:38.633631183 +0000 UTC m=+231.729931136" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.648865 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.649279 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.14926459 +0000 UTC m=+232.245564543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.675237 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" podStartSLOduration=180.675218779 podStartE2EDuration="3m0.675218779s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.671809846 +0000 UTC m=+231.768109799" watchObservedRunningTime="2026-03-16 00:10:38.675218779 +0000 UTC m=+231.771518732" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.750103 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.750514 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.250497294 +0000 UTC m=+232.346797247 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.852407 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.852919 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.35290279 +0000 UTC m=+232.449202743 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.953754 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.953935 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.453910018 +0000 UTC m=+232.550209971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.954078 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.954496 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.454482083 +0000 UTC m=+232.550782036 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.054359 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.054418 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.054861 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.055045 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.555016718 +0000 UTC m=+232.651316671 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.055203 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.055668 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.555650886 +0000 UTC m=+232.651950839 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.056187 4816 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-zrq8d container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.056224 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" podUID="dced2102-9fd0-4300-9e0a-35d915f1caad" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.156329 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.156482 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.656460938 +0000 UTC m=+232.752760901 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.156614 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.156950 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.656940581 +0000 UTC m=+232.753240534 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.179447 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2k6jt" event={"ID":"7759829d-d50c-4dd7-8627-040ebf8f0e40","Type":"ContainerStarted","Data":"95989697f1af58eeb14eed6626f25568a7d89fa8659db716371d303537096b95"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.190882 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt" event={"ID":"0386f821-c5fb-4dfd-acaf-706e214a57c0","Type":"ContainerStarted","Data":"eb1247ca54e1725f92ff47a334cb3f93c7288edaf95c85bb29119f4190447728"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.198969 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" event={"ID":"3eb76779-61d8-4977-8839-083fcf6cd69b","Type":"ContainerStarted","Data":"bfb1da3e05b684a07c1c5093f75a36fc7234932ef5ea1cdcd2af912b2efbadc3"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.201790 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2k6jt" podStartSLOduration=8.201772685 podStartE2EDuration="8.201772685s" podCreationTimestamp="2026-03-16 00:10:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.198059164 +0000 UTC m=+232.294359127" 
watchObservedRunningTime="2026-03-16 00:10:39.201772685 +0000 UTC m=+232.298072648" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.202005 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" event={"ID":"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5","Type":"ContainerStarted","Data":"cd8239e30fa47ad0b09c897db8dede32e8baf15326f27d0d799c7d11f5bf9245"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.212784 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" event={"ID":"7442ef1b-27ea-4166-8457-5332c4c8f363","Type":"ContainerStarted","Data":"52c9358f54c69ab6c19056c97d3fd556c1de4b607e855be3201418922f2918bc"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.216788 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" event={"ID":"7c3e347f-464a-43f1-bf29-689bf81a28e6","Type":"ContainerStarted","Data":"e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.217135 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.218844 4816 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-sshl5 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" start-of-body= Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.218881 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" podUID="7c3e347f-464a-43f1-bf29-689bf81a28e6" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: 
connection refused" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.225170 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" event={"ID":"ef3a7303-57a8-461f-86c1-fd3f7882e93b","Type":"ContainerStarted","Data":"848fdb51e4355de505e505755585dc2dce6c8c4c01ec5f0f58747f35b9c9b660"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.227314 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt" podStartSLOduration=181.227295802 podStartE2EDuration="3m1.227295802s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.225957115 +0000 UTC m=+232.322257068" watchObservedRunningTime="2026-03-16 00:10:39.227295802 +0000 UTC m=+232.323595755" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.239935 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" event={"ID":"9e737c04-a2db-452e-adc7-fa383e158b53","Type":"ContainerStarted","Data":"d2b2e94e44481d7d62ca47549fa81cc832d88928edb0ed280b9862d6e4e1afa8"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.240837 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.244109 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" event={"ID":"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0","Type":"ContainerStarted","Data":"da9257f464c0b60fee5d912e86c720ccf6b77f86904d024f2ffea1f8dfd90424"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.246734 4816 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" podStartSLOduration=181.246720162 podStartE2EDuration="3m1.246720162s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.245271263 +0000 UTC m=+232.341571216" watchObservedRunningTime="2026-03-16 00:10:39.246720162 +0000 UTC m=+232.343020115" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.248018 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-npvts" event={"ID":"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847","Type":"ContainerStarted","Data":"78c7fcdf09f86057529d576ef181b7396649103d70b1badae9d8a06b15d9d653"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.248053 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-npvts" event={"ID":"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847","Type":"ContainerStarted","Data":"c8352705d653ad742717cade5bf89d6cc224bcfeb440f69392e89c3a246c31ef"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.248540 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-npvts" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.250088 4816 generic.go:334] "Generic (PLEG): container finished" podID="4f90d894-17c6-4800-a438-737fe8619e01" containerID="d9558b98ac0b8301b1e2fd81ab83d4eaebf891ae7f77f266a39b5bc52e74f754" exitCode=0 Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.251525 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" event={"ID":"4f90d894-17c6-4800-a438-737fe8619e01","Type":"ContainerDied","Data":"d9558b98ac0b8301b1e2fd81ab83d4eaebf891ae7f77f266a39b5bc52e74f754"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.251972 4816 patch_prober.go:28] interesting 
pod/packageserver-d55dfcdfc-zn6w7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.255296 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" podUID="064b42ee-720b-456c-8ffe-a247f827befc" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.252292 4816 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-gtwmf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.255412 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" podUID="29bcfe72-6ef1-4087-9feb-787fdba3d2d7" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.252709 4816 patch_prober.go:28] interesting pod/console-operator-58897d9998-q9xc9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.255447 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-q9xc9" 
podUID="4cc341f9-55c7-4bce-a0e3-24df68ca7f0e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.254227 4816 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tv2n7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.255475 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" podUID="59c840a8-f288-44ed-83d3-34d47041c6c6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.255519 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.255564 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.261384 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.261528 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.761502796 +0000 UTC m=+232.857802759 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.261894 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.265163 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.765145255 +0000 UTC m=+232.861445258 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.285320 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" podStartSLOduration=182.285291975 podStartE2EDuration="3m2.285291975s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.28434996 +0000 UTC m=+232.380649913" watchObservedRunningTime="2026-03-16 00:10:39.285291975 +0000 UTC m=+232.381591928" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.321848 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" podStartSLOduration=182.321832823 podStartE2EDuration="3m2.321832823s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.317180536 +0000 UTC m=+232.413480489" watchObservedRunningTime="2026-03-16 00:10:39.321832823 +0000 UTC m=+232.418132776" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.342185 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" podStartSLOduration=181.342165578 podStartE2EDuration="3m1.342165578s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.340827622 +0000 UTC m=+232.437127575" watchObservedRunningTime="2026-03-16 00:10:39.342165578 +0000 UTC m=+232.438465531" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.366077 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.367576 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.867530451 +0000 UTC m=+232.963830404 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.391087 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" podStartSLOduration=181.391047763 podStartE2EDuration="3m1.391047763s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.382234182 +0000 UTC m=+232.478534135" watchObservedRunningTime="2026-03-16 00:10:39.391047763 +0000 UTC m=+232.487347716" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.402398 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" podStartSLOduration=181.402379202 podStartE2EDuration="3m1.402379202s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.401693784 +0000 UTC m=+232.497993737" watchObservedRunningTime="2026-03-16 00:10:39.402379202 +0000 UTC m=+232.498679155" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.431901 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" podStartSLOduration=181.431883478 podStartE2EDuration="3m1.431883478s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.430345706 +0000 UTC m=+232.526645659" watchObservedRunningTime="2026-03-16 00:10:39.431883478 +0000 UTC m=+232.528183431" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.453375 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" podStartSLOduration=181.453351934 podStartE2EDuration="3m1.453351934s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.451645537 +0000 UTC m=+232.547945490" watchObservedRunningTime="2026-03-16 00:10:39.453351934 +0000 UTC m=+232.549651887" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.469059 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.469410 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.969397952 +0000 UTC m=+233.065697905 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.521285 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" podStartSLOduration=181.521267078 podStartE2EDuration="3m1.521267078s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.494027725 +0000 UTC m=+232.590327688" watchObservedRunningTime="2026-03-16 00:10:39.521267078 +0000 UTC m=+232.617567031" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.551009 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" podStartSLOduration=181.55099199 podStartE2EDuration="3m1.55099199s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.524440055 +0000 UTC m=+232.620740008" watchObservedRunningTime="2026-03-16 00:10:39.55099199 +0000 UTC m=+232.647291943" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.560300 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:39 crc 
kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:39 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:39 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.560362 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.571040 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.571451 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.071433268 +0000 UTC m=+233.167733221 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.588630 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" podStartSLOduration=181.588612237 podStartE2EDuration="3m1.588612237s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.553047026 +0000 UTC m=+232.649346979" watchObservedRunningTime="2026-03-16 00:10:39.588612237 +0000 UTC m=+232.684912190" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.591064 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" podStartSLOduration=181.591054334 podStartE2EDuration="3m1.591054334s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.588178405 +0000 UTC m=+232.684478358" watchObservedRunningTime="2026-03-16 00:10:39.591054334 +0000 UTC m=+232.687354287" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.672229 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.672527 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.172515898 +0000 UTC m=+233.268815851 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.737528 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54916: no serving certificate available for the kubelet" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.773055 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.773470 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.273452724 +0000 UTC m=+233.369752677 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.785581 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54918: no serving certificate available for the kubelet" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.833496 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54922: no serving certificate available for the kubelet" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.875095 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.875580 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.375566332 +0000 UTC m=+233.471866285 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.893090 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54930: no serving certificate available for the kubelet" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.959024 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54944: no serving certificate available for the kubelet" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.976810 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.976987 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.47695934 +0000 UTC m=+233.573259293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.977113 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.977676 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.477662519 +0000 UTC m=+233.573962472 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.058324 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54956: no serving certificate available for the kubelet" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.078918 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.079149 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.579120619 +0000 UTC m=+233.675420582 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.079231 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.079668 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.579656944 +0000 UTC m=+233.675956977 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.180923 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.181710 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.68168809 +0000 UTC m=+233.777988043 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.252676 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54964: no serving certificate available for the kubelet" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.271472 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" event={"ID":"4f90d894-17c6-4800-a438-737fe8619e01","Type":"ContainerStarted","Data":"398869ad6a60ce3b3a8c87e03134cd0a8845b94fb0f7932db64c46ad9a35842c"} Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.271539 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.274111 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-trp9l" event={"ID":"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2","Type":"ContainerStarted","Data":"3c467605f4551d8c3d5668a70ac90cbcf3f0ed8dd7b0ae3d6b85b6ea1fe8119c"} Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.275221 4816 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-gtwmf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.275263 4816 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" podUID="29bcfe72-6ef1-4087-9feb-787fdba3d2d7" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.275436 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.275235 4816 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-sshl5 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" start-of-body= Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.275636 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" podUID="7c3e347f-464a-43f1-bf29-689bf81a28e6" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.275631 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.282596 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.282934 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.782922184 +0000 UTC m=+233.879222137 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.291749 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-npvts" podStartSLOduration=8.291727524 podStartE2EDuration="8.291727524s" podCreationTimestamp="2026-03-16 00:10:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.649596672 +0000 UTC m=+232.745896625" watchObservedRunningTime="2026-03-16 00:10:40.291727524 +0000 UTC m=+233.388027477" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.292785 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" podStartSLOduration=183.292775353 podStartE2EDuration="3m3.292775353s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:40.288701362 +0000 UTC m=+233.385001315" watchObservedRunningTime="2026-03-16 00:10:40.292775353 +0000 UTC m=+233.389075306" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.383804 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.383947 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.883923921 +0000 UTC m=+233.980223874 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.384271 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.389091 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.889074602 +0000 UTC m=+233.985374635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.485734 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.485897 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.985876094 +0000 UTC m=+234.082176047 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.486335 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.486665 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.986655025 +0000 UTC m=+234.082954978 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.556141 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:40 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:40 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:40 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.556213 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.587454 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.587689 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:10:41.087643013 +0000 UTC m=+234.183942976 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.587951 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.588308 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.08829064 +0000 UTC m=+234.184590623 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.628088 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54972: no serving certificate available for the kubelet" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.688844 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.689157 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.189108543 +0000 UTC m=+234.285408496 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.791414 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.791836 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.291814917 +0000 UTC m=+234.388114870 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.892849 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.893081 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.393034801 +0000 UTC m=+234.489334754 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.893168 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.893503 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.393485323 +0000 UTC m=+234.489785276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.994090 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.996848 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.496825865 +0000 UTC m=+234.593125818 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.997008 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.997430 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.497417981 +0000 UTC m=+234.593717934 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.098139 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.098527 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.598508081 +0000 UTC m=+234.694808034 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.199276 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.199653 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.699637192 +0000 UTC m=+234.795937145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.300477 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.300666 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.800636769 +0000 UTC m=+234.896936732 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.374522 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54984: no serving certificate available for the kubelet"
Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.421326 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.421722 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.921707235 +0000 UTC m=+235.018007188 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.522870 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.523158 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.023106564 +0000 UTC m=+235.119406527 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.523598 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.524014 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.023996268 +0000 UTC m=+235.120296421 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.553008 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:41 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld
Mar 16 00:10:41 crc kubenswrapper[4816]: [+]process-running ok
Mar 16 00:10:41 crc kubenswrapper[4816]: healthz check failed
Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.553071 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.624561 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.625099 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.125079908 +0000 UTC m=+235.221379881 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.726822 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.727338 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.227317069 +0000 UTC m=+235.323617092 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.829562 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.830015 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.329997983 +0000 UTC m=+235.426297936 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.931261 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.931707 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.431687289 +0000 UTC m=+235.527987332 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.032178 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.032309 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.532292056 +0000 UTC m=+235.628592009 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.032428 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.032748 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.532741028 +0000 UTC m=+235.629040981 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.133504 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.133804 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.633762676 +0000 UTC m=+235.730062629 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.133883 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.134207 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.634190218 +0000 UTC m=+235.730490171 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.236488 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.236651 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.736631695 +0000 UTC m=+235.832931648 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.236724 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.237064 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.737056787 +0000 UTC m=+235.833356740 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.336188 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.337004 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.337361 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.337592 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.83753621 +0000 UTC m=+235.933836173 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.337661 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.337891 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.837880039 +0000 UTC m=+235.934179992 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.349132 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4gwcw"]
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.350040 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4gwcw"
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.354963 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.355151 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.357917 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-trp9l" event={"ID":"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2","Type":"ContainerStarted","Data":"568dcbbc8f060aeb28cb18242ceb31f90f8759aece7ef7d23357f966a0c5ba20"}
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.358142 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.404329 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.406592 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4gwcw"]
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.461195 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.461390 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45bbd\" (UniqueName: \"kubernetes.io/projected/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-kube-api-access-45bbd\") pod \"community-operators-4gwcw\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " pod="openshift-marketplace/community-operators-4gwcw"
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.461445 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.461510 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-utilities\") pod \"community-operators-4gwcw\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " pod="openshift-marketplace/community-operators-4gwcw"
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.461527 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-catalog-content\") pod \"community-operators-4gwcw\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " pod="openshift-marketplace/community-operators-4gwcw"
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.461563 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.462161 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.962143352 +0000 UTC m=+236.058443305 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.466089 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tv2n7"]
Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.466299 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" podUID="59c840a8-f288-44ed-83d3-34d47041c6c6" containerName="controller-manager" containerID="cri-o://c46a7076608f889c8e30b77b33715ed49c92e64799e40fe88b9f99f6e980f6a5" gracePeriod=30
Mar 16 00:10:42 crc 
kubenswrapper[4816]: I0316 00:10:42.480906 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.549368 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.549633 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" podUID="1d5466ab-a589-4f7e-ae89-2f494b10f6b1" containerName="route-controller-manager" containerID="cri-o://e90fdfac87f05e45b64d63ce5cb4d5902fbd18d9c1d580577069351527db0c29" gracePeriod=30 Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.557987 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.562010 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:42 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:42 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:42 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.562066 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.562338 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-45bbd\" (UniqueName: \"kubernetes.io/projected/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-kube-api-access-45bbd\") pod \"community-operators-4gwcw\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.562380 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.562418 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.562471 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-utilities\") pod \"community-operators-4gwcw\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.562487 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-catalog-content\") pod \"community-operators-4gwcw\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.562509 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.563017 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.063007016 +0000 UTC m=+236.159306969 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.563058 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.563444 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-catalog-content\") pod \"community-operators-4gwcw\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.563492 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-utilities\") pod \"community-operators-4gwcw\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.612273 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wh2h7"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.615293 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.620780 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45bbd\" (UniqueName: \"kubernetes.io/projected/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-kube-api-access-45bbd\") pod \"community-operators-4gwcw\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.621139 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.637285 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.650343 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wh2h7"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.657894 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.665288 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.666066 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.666451 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.16643465 +0000 UTC m=+236.262734603 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.666472 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.667129 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.669388 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.671893 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.672427 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.698019 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54988: no serving certificate available for the kubelet" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.751815 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bkxpc"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.756754 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.762033 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bkxpc"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.768151 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4mjs\" (UniqueName: \"kubernetes.io/projected/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-kube-api-access-w4mjs\") pod \"certified-operators-wh2h7\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.768187 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-catalog-content\") pod \"certified-operators-wh2h7\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.768260 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-utilities\") pod \"certified-operators-wh2h7\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.768306 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 
00:10:42.768336 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c90a604-be49-44dc-b350-9df660d8587b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5c90a604-be49-44dc-b350-9df660d8587b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.768360 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c90a604-be49-44dc-b350-9df660d8587b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5c90a604-be49-44dc-b350-9df660d8587b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.768728 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.268715793 +0000 UTC m=+236.365015746 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.871412 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.871658 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-utilities\") pod \"certified-operators-wh2h7\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.871694 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-catalog-content\") pod \"community-operators-bkxpc\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.871750 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c90a604-be49-44dc-b350-9df660d8587b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5c90a604-be49-44dc-b350-9df660d8587b\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.871772 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c90a604-be49-44dc-b350-9df660d8587b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5c90a604-be49-44dc-b350-9df660d8587b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.871810 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8frh\" (UniqueName: \"kubernetes.io/projected/ff69863d-13e1-444c-ba61-6d68a509a203-kube-api-access-p8frh\") pod \"community-operators-bkxpc\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.874694 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c90a604-be49-44dc-b350-9df660d8587b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5c90a604-be49-44dc-b350-9df660d8587b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.875114 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-utilities\") pod \"certified-operators-wh2h7\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.877683 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4mjs\" (UniqueName: \"kubernetes.io/projected/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-kube-api-access-w4mjs\") pod \"certified-operators-wh2h7\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " 
pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.877723 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-utilities\") pod \"community-operators-bkxpc\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.877741 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-catalog-content\") pod \"certified-operators-wh2h7\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.878129 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-catalog-content\") pod \"certified-operators-wh2h7\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.878492 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.378469149 +0000 UTC m=+236.474769102 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.901534 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4mjs\" (UniqueName: \"kubernetes.io/projected/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-kube-api-access-w4mjs\") pod \"certified-operators-wh2h7\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.902140 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c90a604-be49-44dc-b350-9df660d8587b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5c90a604-be49-44dc-b350-9df660d8587b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.931364 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8q2xw"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.932395 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.946314 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8q2xw"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.953504 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.978772 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-catalog-content\") pod \"community-operators-bkxpc\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.978835 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.978899 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8frh\" (UniqueName: \"kubernetes.io/projected/ff69863d-13e1-444c-ba61-6d68a509a203-kube-api-access-p8frh\") pod \"community-operators-bkxpc\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.978947 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-utilities\") pod \"community-operators-bkxpc\" (UID: 
\"ff69863d-13e1-444c-ba61-6d68a509a203\") " pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.979250 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.4792288 +0000 UTC m=+236.575528803 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.979358 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-utilities\") pod \"community-operators-bkxpc\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.979843 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-catalog-content\") pod \"community-operators-bkxpc\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:42 crc kubenswrapper[4816]: W0316 00:10:42.991796 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod46c3b4df_48c2_4131_8ac4_ea6276d70d54.slice/crio-2c1fef6767b0e1fca9d9b3150317fb66c123ea69c5309a134a4ad450d4919157 WatchSource:0}: Error finding container 
2c1fef6767b0e1fca9d9b3150317fb66c123ea69c5309a134a4ad450d4919157: Status 404 returned error can't find the container with id 2c1fef6767b0e1fca9d9b3150317fb66c123ea69c5309a134a4ad450d4919157 Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.010810 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8frh\" (UniqueName: \"kubernetes.io/projected/ff69863d-13e1-444c-ba61-6d68a509a203-kube-api-access-p8frh\") pod \"community-operators-bkxpc\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.016929 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.033518 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4gwcw"] Mar 16 00:10:43 crc kubenswrapper[4816]: W0316 00:10:43.045262 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad80e1a9_75dc_4860_9bd9_d59b0c0ae43c.slice/crio-661437598c338aed0d5a7d52e67330434003899adaefd998268791f6175ab8ca WatchSource:0}: Error finding container 661437598c338aed0d5a7d52e67330434003899adaefd998268791f6175ab8ca: Status 404 returned error can't find the container with id 661437598c338aed0d5a7d52e67330434003899adaefd998268791f6175ab8ca Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.070542 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.074309 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm"
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.079926 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.080085 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gwq2\" (UniqueName: \"kubernetes.io/projected/bf586c6e-f957-46fc-8140-f9a9ea22510f-kube-api-access-8gwq2\") pod \"certified-operators-8q2xw\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " pod="openshift-marketplace/certified-operators-8q2xw"
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.080149 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-utilities\") pod \"certified-operators-8q2xw\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " pod="openshift-marketplace/certified-operators-8q2xw"
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.080176 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-catalog-content\") pod \"certified-operators-8q2xw\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " pod="openshift-marketplace/certified-operators-8q2xw"
Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.080284 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.580271399 +0000 UTC m=+236.676571352 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.086730 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bkxpc"
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.182033 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gwq2\" (UniqueName: \"kubernetes.io/projected/bf586c6e-f957-46fc-8140-f9a9ea22510f-kube-api-access-8gwq2\") pod \"certified-operators-8q2xw\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " pod="openshift-marketplace/certified-operators-8q2xw"
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.182452 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.182496 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-utilities\") pod \"certified-operators-8q2xw\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " pod="openshift-marketplace/certified-operators-8q2xw"
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.182518 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-catalog-content\") pod \"certified-operators-8q2xw\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " pod="openshift-marketplace/certified-operators-8q2xw"
Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.182932 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.682919472 +0000 UTC m=+236.779219425 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.229953 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-catalog-content\") pod \"certified-operators-8q2xw\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " pod="openshift-marketplace/certified-operators-8q2xw"
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.239904 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gwq2\" (UniqueName: \"kubernetes.io/projected/bf586c6e-f957-46fc-8140-f9a9ea22510f-kube-api-access-8gwq2\") pod \"certified-operators-8q2xw\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " pod="openshift-marketplace/certified-operators-8q2xw"
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.245788 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-utilities\") pod \"certified-operators-8q2xw\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " pod="openshift-marketplace/certified-operators-8q2xw"
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.271201 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8q2xw"
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.280781 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wh2h7"]
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.284118 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.284488 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.784473494 +0000 UTC m=+236.880773447 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.374174 4816 generic.go:334] "Generic (PLEG): container finished" podID="1d5466ab-a589-4f7e-ae89-2f494b10f6b1" containerID="e90fdfac87f05e45b64d63ce5cb4d5902fbd18d9c1d580577069351527db0c29" exitCode=0
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.374220 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" event={"ID":"1d5466ab-a589-4f7e-ae89-2f494b10f6b1","Type":"ContainerDied","Data":"e90fdfac87f05e45b64d63ce5cb4d5902fbd18d9c1d580577069351527db0c29"}
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.375749 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4gwcw" event={"ID":"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c","Type":"ContainerStarted","Data":"661437598c338aed0d5a7d52e67330434003899adaefd998268791f6175ab8ca"}
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.382769 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wh2h7" event={"ID":"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd","Type":"ContainerStarted","Data":"0e89bdbfb4ed11608191b3360966bdeb2f13767d41154d3097545518437bcaec"}
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.385068 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"46c3b4df-48c2-4131-8ac4-ea6276d70d54","Type":"ContainerStarted","Data":"2c1fef6767b0e1fca9d9b3150317fb66c123ea69c5309a134a4ad450d4919157"}
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.385393 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.386422 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.885796261 +0000 UTC m=+236.982096214 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.386753 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bkxpc"]
Mar 16 00:10:43 crc kubenswrapper[4816]: W0316 00:10:43.396096 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff69863d_13e1_444c_ba61_6d68a509a203.slice/crio-bb49e18eedefb469d6109a46587df4a9ef4eb1e0a35954df9209a551fbb7b5b4 WatchSource:0}: Error finding container bb49e18eedefb469d6109a46587df4a9ef4eb1e0a35954df9209a551fbb7b5b4: Status 404 returned error can't find the container with id bb49e18eedefb469d6109a46587df4a9ef4eb1e0a35954df9209a551fbb7b5b4
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.487157 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.487516 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.987490657 +0000 UTC m=+237.083790610 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.487796 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.488502 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.988480874 +0000 UTC m=+237.084780827 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.530785 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8q2xw"]
Mar 16 00:10:43 crc kubenswrapper[4816]: W0316 00:10:43.538283 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf586c6e_f957_46fc_8140_f9a9ea22510f.slice/crio-e4ae76a3c7fcca7fb114ac9afc90c35f9554a0225d9ad44974098c92d5909906 WatchSource:0}: Error finding container e4ae76a3c7fcca7fb114ac9afc90c35f9554a0225d9ad44974098c92d5909906: Status 404 returned error can't find the container with id e4ae76a3c7fcca7fb114ac9afc90c35f9554a0225d9ad44974098c92d5909906
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.556769 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:43 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld
Mar 16 00:10:43 crc kubenswrapper[4816]: [+]process-running ok
Mar 16 00:10:43 crc kubenswrapper[4816]: healthz check failed
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.556821 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.589070 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.589530 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.089513503 +0000 UTC m=+237.185813456 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.667659 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.690625 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.691131 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.191094406 +0000 UTC m=+237.287394369 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:43 crc kubenswrapper[4816]: W0316 00:10:43.737877 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5c90a604_be49_44dc_b350_9df660d8587b.slice/crio-f6dda169be5b3a08dfad7ad03016a7926ed8e41ada3f5914164856f0c44926f0 WatchSource:0}: Error finding container f6dda169be5b3a08dfad7ad03016a7926ed8e41ada3f5914164856f0c44926f0: Status 404 returned error can't find the container with id f6dda169be5b3a08dfad7ad03016a7926ed8e41ada3f5914164856f0c44926f0
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.791294 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.791747 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.291727624 +0000 UTC m=+237.388027577 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.907237 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.907638 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.407623188 +0000 UTC m=+237.503923161 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.948925 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d"
Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.948963 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d"
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.008662 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.008825 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.508806111 +0000 UTC m=+237.605106064 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.008981 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.009319 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.509309385 +0000 UTC m=+237.605609338 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.064852 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.073612 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.110829 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.112588 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.612532392 +0000 UTC m=+237.708832345 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.212628 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.213163 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.713145449 +0000 UTC m=+237.809445402 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.313835 4816 patch_prober.go:28] interesting pod/apiserver-76f77b778f-pdm8d container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 16 00:10:44 crc kubenswrapper[4816]: [+]log ok
Mar 16 00:10:44 crc kubenswrapper[4816]: [+]etcd ok
Mar 16 00:10:44 crc kubenswrapper[4816]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 16 00:10:44 crc kubenswrapper[4816]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 16 00:10:44 crc kubenswrapper[4816]: [+]poststarthook/max-in-flight-filter ok
Mar 16 00:10:44 crc kubenswrapper[4816]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 16 00:10:44 crc kubenswrapper[4816]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 16 00:10:44 crc kubenswrapper[4816]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 16 00:10:44 crc kubenswrapper[4816]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Mar 16 00:10:44 crc kubenswrapper[4816]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 16 00:10:44 crc kubenswrapper[4816]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 16 00:10:44 crc kubenswrapper[4816]: [+]poststarthook/openshift.io-startinformers ok
Mar 16 00:10:44 crc kubenswrapper[4816]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 16 00:10:44 crc kubenswrapper[4816]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 16 00:10:44 crc kubenswrapper[4816]: livez check failed
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.313942 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" podUID="1f26ea52-1f97-4d4a-98bd-897c5b3b88c5" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.314078 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.314633 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.814615889 +0000 UTC m=+237.910915842 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.338001 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7pb49"]
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.339152 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pb49"
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.342218 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.346646 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pb49"]
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.361013 4816 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-d9j8j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.361064 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" podUID="1d5466ab-a589-4f7e-ae89-2f494b10f6b1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused"
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.415433 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j24xj\" (UniqueName: \"kubernetes.io/projected/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-kube-api-access-j24xj\") pod \"redhat-marketplace-7pb49\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " pod="openshift-marketplace/redhat-marketplace-7pb49"
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.415480 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-catalog-content\") pod \"redhat-marketplace-7pb49\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " pod="openshift-marketplace/redhat-marketplace-7pb49"
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.415586 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.415639 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-utilities\") pod \"redhat-marketplace-7pb49\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " pod="openshift-marketplace/redhat-marketplace-7pb49"
Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.415926 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.915911305 +0000 UTC m=+238.012211258 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.419299 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.419340 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.424391 4816 patch_prober.go:28] interesting pod/console-f9d7485db-nnqsw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.424452 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nnqsw" podUID="32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused"
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.434631 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5c90a604-be49-44dc-b350-9df660d8587b","Type":"ContainerStarted","Data":"de4df241bf0429f4c7d3687854c459dc61daeb2dd35192a1c0611d80c7988415"}
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.434682 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5c90a604-be49-44dc-b350-9df660d8587b","Type":"ContainerStarted","Data":"f6dda169be5b3a08dfad7ad03016a7926ed8e41ada3f5914164856f0c44926f0"}
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.437402 4816 generic.go:334] "Generic (PLEG): container finished" podID="59c840a8-f288-44ed-83d3-34d47041c6c6" containerID="c46a7076608f889c8e30b77b33715ed49c92e64799e40fe88b9f99f6e980f6a5" exitCode=0
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.437459 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" event={"ID":"59c840a8-f288-44ed-83d3-34d47041c6c6","Type":"ContainerDied","Data":"c46a7076608f889c8e30b77b33715ed49c92e64799e40fe88b9f99f6e980f6a5"}
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.438676 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"46c3b4df-48c2-4131-8ac4-ea6276d70d54","Type":"ContainerStarted","Data":"cf91bdab1d487b3e9c82088df50a44f305fd300ad681fc9e9b0b6cb01f350748"}
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.440611 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-trp9l" event={"ID":"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2","Type":"ContainerStarted","Data":"4aa6683570021372f0c3ac10aa502521835a45064050e765d6899b5fc0d01fa9"}
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.441915 4816 generic.go:334] "Generic (PLEG): container finished" podID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerID="16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89" exitCode=0
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.441962 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4gwcw" event={"ID":"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c","Type":"ContainerDied","Data":"16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89"}
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.444313 4816 generic.go:334] "Generic (PLEG): container finished" podID="ff69863d-13e1-444c-ba61-6d68a509a203" containerID="2d7e1ead92ce8010c6084321e28f13a3b17186a0141a9a086a18947183a41d47" exitCode=0
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.444449 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkxpc" event={"ID":"ff69863d-13e1-444c-ba61-6d68a509a203","Type":"ContainerDied","Data":"2d7e1ead92ce8010c6084321e28f13a3b17186a0141a9a086a18947183a41d47"}
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.444480 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkxpc" event={"ID":"ff69863d-13e1-444c-ba61-6d68a509a203","Type":"ContainerStarted","Data":"bb49e18eedefb469d6109a46587df4a9ef4eb1e0a35954df9209a551fbb7b5b4"}
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.456097 4816 generic.go:334] "Generic (PLEG): container finished" podID="9397185e-a9e3-4ef4-b0be-d9dc9208adff" containerID="a26a0a4314d400c57bc06c18480ab7a501ebc981f4b8dbd60334dd3390aec49c" exitCode=0
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.456251 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" event={"ID":"9397185e-a9e3-4ef4-b0be-d9dc9208adff","Type":"ContainerDied","Data":"a26a0a4314d400c57bc06c18480ab7a501ebc981f4b8dbd60334dd3390aec49c"}
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.465437 4816 generic.go:334] "Generic (PLEG): container finished" podID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerID="307037ba13f42f68192fcc6d4406e472de7d9aac5f7546be49cd42537db26240" exitCode=0
Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.465517 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8q2xw"
event={"ID":"bf586c6e-f957-46fc-8140-f9a9ea22510f","Type":"ContainerDied","Data":"307037ba13f42f68192fcc6d4406e472de7d9aac5f7546be49cd42537db26240"} Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.465539 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8q2xw" event={"ID":"bf586c6e-f957-46fc-8140-f9a9ea22510f","Type":"ContainerStarted","Data":"e4ae76a3c7fcca7fb114ac9afc90c35f9554a0225d9ad44974098c92d5909906"} Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.481700 4816 generic.go:334] "Generic (PLEG): container finished" podID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerID="c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638" exitCode=0 Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.481856 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wh2h7" event={"ID":"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd","Type":"ContainerDied","Data":"c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638"} Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.516274 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.516435 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:45.016413289 +0000 UTC m=+238.112713252 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.516499 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.516596 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-utilities\") pod \"redhat-marketplace-7pb49\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.516705 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j24xj\" (UniqueName: \"kubernetes.io/projected/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-kube-api-access-j24xj\") pod \"redhat-marketplace-7pb49\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.516729 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-catalog-content\") pod \"redhat-marketplace-7pb49\" (UID: 
\"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.516903 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:45.016886572 +0000 UTC m=+238.113186525 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.517796 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-utilities\") pod \"redhat-marketplace-7pb49\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.518061 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-catalog-content\") pod \"redhat-marketplace-7pb49\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.524409 4816 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.541314 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j24xj\" (UniqueName: \"kubernetes.io/projected/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-kube-api-access-j24xj\") pod \"redhat-marketplace-7pb49\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.550350 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.553806 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:44 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:44 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:44 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.553855 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.618092 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.618306 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:45.1182595 +0000 UTC m=+238.214559453 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.618926 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.619672 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:45.119654038 +0000 UTC m=+238.215954081 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.654336 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.697985 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.719492 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.720059 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:45.220039969 +0000 UTC m=+238.316339922 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.732275 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.735579 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.747895 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.747901 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.748525 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.747956 4816 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.757216 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6gbkl"] Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.764050 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59c840a8-f288-44ed-83d3-34d47041c6c6" containerName="controller-manager" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.764078 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c840a8-f288-44ed-83d3-34d47041c6c6" containerName="controller-manager" Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.764090 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d5466ab-a589-4f7e-ae89-2f494b10f6b1" containerName="route-controller-manager" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.764098 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d5466ab-a589-4f7e-ae89-2f494b10f6b1" containerName="route-controller-manager" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.764218 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="59c840a8-f288-44ed-83d3-34d47041c6c6" containerName="controller-manager" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.764237 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d5466ab-a589-4f7e-ae89-2f494b10f6b1" containerName="route-controller-manager" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.765507 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.783800 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6gbkl"] Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.811316 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.820423 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-config\") pod \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.820467 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-serving-cert\") pod \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.820507 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-config\") pod \"59c840a8-f288-44ed-83d3-34d47041c6c6\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.820779 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvtdx\" (UniqueName: \"kubernetes.io/projected/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-kube-api-access-fvtdx\") pod \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.820812 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-client-ca\") pod \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.820870 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-client-ca\") pod \"59c840a8-f288-44ed-83d3-34d47041c6c6\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.820889 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-proxy-ca-bundles\") pod \"59c840a8-f288-44ed-83d3-34d47041c6c6\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.820945 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsbsw\" (UniqueName: \"kubernetes.io/projected/59c840a8-f288-44ed-83d3-34d47041c6c6-kube-api-access-zsbsw\") pod \"59c840a8-f288-44ed-83d3-34d47041c6c6\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.821012 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59c840a8-f288-44ed-83d3-34d47041c6c6-serving-cert\") pod \"59c840a8-f288-44ed-83d3-34d47041c6c6\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.821261 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: 
\"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.821390 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-config" (OuterVolumeSpecName: "config") pod "1d5466ab-a589-4f7e-ae89-2f494b10f6b1" (UID: "1d5466ab-a589-4f7e-ae89-2f494b10f6b1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.821689 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.822170 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "59c840a8-f288-44ed-83d3-34d47041c6c6" (UID: "59c840a8-f288-44ed-83d3-34d47041c6c6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.822260 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:45.322247749 +0000 UTC m=+238.418547702 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.822599 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-client-ca" (OuterVolumeSpecName: "client-ca") pod "59c840a8-f288-44ed-83d3-34d47041c6c6" (UID: "59c840a8-f288-44ed-83d3-34d47041c6c6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.823100 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-config" (OuterVolumeSpecName: "config") pod "59c840a8-f288-44ed-83d3-34d47041c6c6" (UID: "59c840a8-f288-44ed-83d3-34d47041c6c6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.824636 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-client-ca" (OuterVolumeSpecName: "client-ca") pod "1d5466ab-a589-4f7e-ae89-2f494b10f6b1" (UID: "1d5466ab-a589-4f7e-ae89-2f494b10f6b1"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.827994 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59c840a8-f288-44ed-83d3-34d47041c6c6-kube-api-access-zsbsw" (OuterVolumeSpecName: "kube-api-access-zsbsw") pod "59c840a8-f288-44ed-83d3-34d47041c6c6" (UID: "59c840a8-f288-44ed-83d3-34d47041c6c6"). InnerVolumeSpecName "kube-api-access-zsbsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.828087 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-kube-api-access-fvtdx" (OuterVolumeSpecName: "kube-api-access-fvtdx") pod "1d5466ab-a589-4f7e-ae89-2f494b10f6b1" (UID: "1d5466ab-a589-4f7e-ae89-2f494b10f6b1"). InnerVolumeSpecName "kube-api-access-fvtdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.829085 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1d5466ab-a589-4f7e-ae89-2f494b10f6b1" (UID: "1d5466ab-a589-4f7e-ae89-2f494b10f6b1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.841321 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c840a8-f288-44ed-83d3-34d47041c6c6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "59c840a8-f288-44ed-83d3-34d47041c6c6" (UID: "59c840a8-f288-44ed-83d3-34d47041c6c6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.888455 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pb49"] Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.890057 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923208 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923379 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-catalog-content\") pod \"redhat-marketplace-6gbkl\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923468 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-utilities\") pod \"redhat-marketplace-6gbkl\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923524 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2d9d\" (UniqueName: \"kubernetes.io/projected/bad7b5f7-88a8-4c20-a010-734a46f59e05-kube-api-access-l2d9d\") pod \"redhat-marketplace-6gbkl\" (UID: 
\"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923597 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923611 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923621 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923630 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsbsw\" (UniqueName: \"kubernetes.io/projected/59c840a8-f288-44ed-83d3-34d47041c6c6-kube-api-access-zsbsw\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923639 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59c840a8-f288-44ed-83d3-34d47041c6c6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923647 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923657 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 
00:10:44.923667 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvtdx\" (UniqueName: \"kubernetes.io/projected/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-kube-api-access-fvtdx\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.924195 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:45.424161842 +0000 UTC m=+238.520461815 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.930753 4816 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-16T00:10:44.524432908Z","Handler":null,"Name":""} Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.934932 4816 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.934968 4816 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.025031 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.025103 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-utilities\") pod \"redhat-marketplace-6gbkl\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.025166 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2d9d\" (UniqueName: \"kubernetes.io/projected/bad7b5f7-88a8-4c20-a010-734a46f59e05-kube-api-access-l2d9d\") pod \"redhat-marketplace-6gbkl\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.025245 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-catalog-content\") pod \"redhat-marketplace-6gbkl\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.025968 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-catalog-content\") pod \"redhat-marketplace-6gbkl\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.026830 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-utilities\") pod \"redhat-marketplace-6gbkl\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.029089 4816 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.029121 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.048520 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2d9d\" (UniqueName: \"kubernetes.io/projected/bad7b5f7-88a8-4c20-a010-734a46f59e05-kube-api-access-l2d9d\") pod \"redhat-marketplace-6gbkl\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.077501 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 
00:10:45.098390 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.128852 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.136649 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.139650 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.144804 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.227253 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.232482 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.276307 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.280156 4816 ???:1] "http: TLS handshake error from 192.168.126.11:55002: no serving certificate available for the kubelet" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.332832 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.379827 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t"] Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.380867 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.387311 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv"] Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.389107 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.398606 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t"] Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.412329 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv"] Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.494157 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" event={"ID":"1d5466ab-a589-4f7e-ae89-2f494b10f6b1","Type":"ContainerDied","Data":"8bd92ab2e8746013ff96fbb3362f4a912a98fe884156f1b95b8704505ab4fe1a"} Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.494233 4816 scope.go:117] "RemoveContainer" containerID="e90fdfac87f05e45b64d63ce5cb4d5902fbd18d9c1d580577069351527db0c29" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.494397 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.512210 4816 generic.go:334] "Generic (PLEG): container finished" podID="5c90a604-be49-44dc-b350-9df660d8587b" containerID="de4df241bf0429f4c7d3687854c459dc61daeb2dd35192a1c0611d80c7988415" exitCode=0 Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.512277 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5c90a604-be49-44dc-b350-9df660d8587b","Type":"ContainerDied","Data":"de4df241bf0429f4c7d3687854c459dc61daeb2dd35192a1c0611d80c7988415"} Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.525837 4816 generic.go:334] "Generic (PLEG): container finished" podID="46c3b4df-48c2-4131-8ac4-ea6276d70d54" containerID="cf91bdab1d487b3e9c82088df50a44f305fd300ad681fc9e9b0b6cb01f350748" exitCode=0 Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.526009 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"46c3b4df-48c2-4131-8ac4-ea6276d70d54","Type":"ContainerDied","Data":"cf91bdab1d487b3e9c82088df50a44f305fd300ad681fc9e9b0b6cb01f350748"} Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.529675 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"] Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.529723 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-trp9l" event={"ID":"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2","Type":"ContainerStarted","Data":"aa1149a96b819d638525f60a56566f3acd4e8b31d993c4dcdb3dcaa8740d2a99"} Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.535108 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"] Mar 
16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.535491 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" event={"ID":"59c840a8-f288-44ed-83d3-34d47041c6c6","Type":"ContainerDied","Data":"360f090f6a27a9d9ebb782602e54104c845a3d5e91127b115ef7d468e384ebfe"} Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.535798 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.538491 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-proxy-ca-bundles\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.538567 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnvd7\" (UniqueName: \"kubernetes.io/projected/1feb17b5-7946-4727-a954-d516a9b8469b-kube-api-access-jnvd7\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.538591 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt6qg\" (UniqueName: \"kubernetes.io/projected/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-kube-api-access-qt6qg\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.538709 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1feb17b5-7946-4727-a954-d516a9b8469b-serving-cert\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.538753 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-client-ca\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.538801 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-config\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.538868 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-config\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.538965 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-client-ca\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: 
\"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.539033 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-serving-cert\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.558773 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:45 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:45 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:45 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.558853 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.602171 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-trp9l" podStartSLOduration=14.602152913 podStartE2EDuration="14.602152913s" podCreationTimestamp="2026-03-16 00:10:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:45.600293382 +0000 UTC m=+238.696593335" watchObservedRunningTime="2026-03-16 00:10:45.602152913 +0000 UTC m=+238.698452866" Mar 16 
00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.640679 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-client-ca\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.640739 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-serving-cert\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.641485 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-proxy-ca-bundles\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.641586 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnvd7\" (UniqueName: \"kubernetes.io/projected/1feb17b5-7946-4727-a954-d516a9b8469b-kube-api-access-jnvd7\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.641612 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt6qg\" (UniqueName: \"kubernetes.io/projected/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-kube-api-access-qt6qg\") pod 
\"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.641667 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1feb17b5-7946-4727-a954-d516a9b8469b-serving-cert\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.641693 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-client-ca\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.641820 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-config\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.641856 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-config\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.642967 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-proxy-ca-bundles\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.643239 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-client-ca\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.644113 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-client-ca\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.644145 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-config\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.644206 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-config\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.661121 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-serving-cert\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.670905 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt6qg\" (UniqueName: \"kubernetes.io/projected/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-kube-api-access-qt6qg\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.677916 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d5466ab-a589-4f7e-ae89-2f494b10f6b1" path="/var/lib/kubelet/pods/1d5466ab-a589-4f7e-ae89-2f494b10f6b1/volumes" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.686378 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1feb17b5-7946-4727-a954-d516a9b8469b-serving-cert\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.690000 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnvd7\" (UniqueName: \"kubernetes.io/projected/1feb17b5-7946-4727-a954-d516a9b8469b-kube-api-access-jnvd7\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.700853 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.705905 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.714404 4816 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tv2n7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.714480 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" podUID="59c840a8-f288-44ed-83d3-34d47041c6c6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.729261 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.730162 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tv2n7"] Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.730193 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tv2n7"] Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.738802 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-52qs6"] Mar 16 00:10:45 crc 
kubenswrapper[4816]: I0316 00:10:45.740400 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.743233 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.748403 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-52qs6"] Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.843668 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-catalog-content\") pod \"redhat-operators-52qs6\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.843728 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrpff\" (UniqueName: \"kubernetes.io/projected/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-kube-api-access-mrpff\") pod \"redhat-operators-52qs6\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.843765 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-utilities\") pod \"redhat-operators-52qs6\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.945336 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-catalog-content\") pod \"redhat-operators-52qs6\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.945396 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrpff\" (UniqueName: \"kubernetes.io/projected/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-kube-api-access-mrpff\") pod \"redhat-operators-52qs6\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.945436 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-utilities\") pod \"redhat-operators-52qs6\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.945862 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-utilities\") pod \"redhat-operators-52qs6\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.946143 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-catalog-content\") pod \"redhat-operators-52qs6\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.974667 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrpff\" (UniqueName: 
\"kubernetes.io/projected/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-kube-api-access-mrpff\") pod \"redhat-operators-52qs6\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.082756 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.136737 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hvpqn"] Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.137746 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.148417 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hvpqn"] Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.249143 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk7j9\" (UniqueName: \"kubernetes.io/projected/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-kube-api-access-qk7j9\") pod \"redhat-operators-hvpqn\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.249238 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-catalog-content\") pod \"redhat-operators-hvpqn\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.249269 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-utilities\") pod \"redhat-operators-hvpqn\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.351124 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk7j9\" (UniqueName: \"kubernetes.io/projected/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-kube-api-access-qk7j9\") pod \"redhat-operators-hvpqn\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.351224 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-catalog-content\") pod \"redhat-operators-hvpqn\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.351242 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-utilities\") pod \"redhat-operators-hvpqn\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.351771 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-catalog-content\") pod \"redhat-operators-hvpqn\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.351784 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-utilities\") pod \"redhat-operators-hvpqn\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.369027 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk7j9\" (UniqueName: \"kubernetes.io/projected/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-kube-api-access-qk7j9\") pod \"redhat-operators-hvpqn\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.471669 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.554138 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:46 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:46 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:46 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.554387 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:47 crc kubenswrapper[4816]: I0316 00:10:47.553283 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:47 crc kubenswrapper[4816]: [-]has-synced failed: 
reason withheld Mar 16 00:10:47 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:47 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:47 crc kubenswrapper[4816]: I0316 00:10:47.553365 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:47 crc kubenswrapper[4816]: I0316 00:10:47.685573 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59c840a8-f288-44ed-83d3-34d47041c6c6" path="/var/lib/kubelet/pods/59c840a8-f288-44ed-83d3-34d47041c6c6/volumes" Mar 16 00:10:48 crc kubenswrapper[4816]: I0316 00:10:48.333195 4816 ???:1] "http: TLS handshake error from 192.168.126.11:52476: no serving certificate available for the kubelet" Mar 16 00:10:48 crc kubenswrapper[4816]: I0316 00:10:48.553620 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:48 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:48 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:48 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:48 crc kubenswrapper[4816]: I0316 00:10:48.553682 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:48 crc kubenswrapper[4816]: I0316 00:10:48.954294 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:48 crc kubenswrapper[4816]: I0316 00:10:48.961503 4816 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:49 crc kubenswrapper[4816]: W0316 00:10:49.264622 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5ba22dd_8e8e_4beb_a540_e5c9687810b8.slice/crio-7718a309c71ba8a48a463087b2e901f51d954ea050a7be786e3c0a847d6a54eb WatchSource:0}: Error finding container 7718a309c71ba8a48a463087b2e901f51d954ea050a7be786e3c0a847d6a54eb: Status 404 returned error can't find the container with id 7718a309c71ba8a48a463087b2e901f51d954ea050a7be786e3c0a847d6a54eb Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.273173 4816 scope.go:117] "RemoveContainer" containerID="c46a7076608f889c8e30b77b33715ed49c92e64799e40fe88b9f99f6e980f6a5" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.335835 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.360075 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.363945 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.392447 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c90a604-be49-44dc-b350-9df660d8587b-kube-api-access\") pod \"5c90a604-be49-44dc-b350-9df660d8587b\" (UID: \"5c90a604-be49-44dc-b350-9df660d8587b\") " Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.392496 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c90a604-be49-44dc-b350-9df660d8587b-kubelet-dir\") pod \"5c90a604-be49-44dc-b350-9df660d8587b\" (UID: \"5c90a604-be49-44dc-b350-9df660d8587b\") " Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.392873 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c90a604-be49-44dc-b350-9df660d8587b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5c90a604-be49-44dc-b350-9df660d8587b" (UID: "5c90a604-be49-44dc-b350-9df660d8587b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.398276 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c90a604-be49-44dc-b350-9df660d8587b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5c90a604-be49-44dc-b350-9df660d8587b" (UID: "5c90a604-be49-44dc-b350-9df660d8587b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.494102 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9397185e-a9e3-4ef4-b0be-d9dc9208adff-config-volume\") pod \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.494144 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kube-api-access\") pod \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\" (UID: \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\") " Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.494168 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdffs\" (UniqueName: \"kubernetes.io/projected/9397185e-a9e3-4ef4-b0be-d9dc9208adff-kube-api-access-pdffs\") pod \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.494206 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kubelet-dir\") pod \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\" (UID: \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\") " Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.494279 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9397185e-a9e3-4ef4-b0be-d9dc9208adff-secret-volume\") pod \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.494586 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/5c90a604-be49-44dc-b350-9df660d8587b-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.494607 4816 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c90a604-be49-44dc-b350-9df660d8587b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.495043 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9397185e-a9e3-4ef4-b0be-d9dc9208adff-config-volume" (OuterVolumeSpecName: "config-volume") pod "9397185e-a9e3-4ef4-b0be-d9dc9208adff" (UID: "9397185e-a9e3-4ef4-b0be-d9dc9208adff"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.495109 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "46c3b4df-48c2-4131-8ac4-ea6276d70d54" (UID: "46c3b4df-48c2-4131-8ac4-ea6276d70d54"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.498146 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9397185e-a9e3-4ef4-b0be-d9dc9208adff-kube-api-access-pdffs" (OuterVolumeSpecName: "kube-api-access-pdffs") pod "9397185e-a9e3-4ef4-b0be-d9dc9208adff" (UID: "9397185e-a9e3-4ef4-b0be-d9dc9208adff"). InnerVolumeSpecName "kube-api-access-pdffs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.498304 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "46c3b4df-48c2-4131-8ac4-ea6276d70d54" (UID: "46c3b4df-48c2-4131-8ac4-ea6276d70d54"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.498733 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9397185e-a9e3-4ef4-b0be-d9dc9208adff-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9397185e-a9e3-4ef4-b0be-d9dc9208adff" (UID: "9397185e-a9e3-4ef4-b0be-d9dc9208adff"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.555612 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:49 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:49 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:49 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.556458 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.558603 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" 
event={"ID":"9397185e-a9e3-4ef4-b0be-d9dc9208adff","Type":"ContainerDied","Data":"b19b5574ead1cf818c519a7ffb8ef773b81e380296fd94d88cb6d44a3be77066"} Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.558648 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b19b5574ead1cf818c519a7ffb8ef773b81e380296fd94d88cb6d44a3be77066" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.558626 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.561468 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5c90a604-be49-44dc-b350-9df660d8587b","Type":"ContainerDied","Data":"f6dda169be5b3a08dfad7ad03016a7926ed8e41ada3f5914164856f0c44926f0"} Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.561495 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6dda169be5b3a08dfad7ad03016a7926ed8e41ada3f5914164856f0c44926f0" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.561581 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.581315 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.581307 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"46c3b4df-48c2-4131-8ac4-ea6276d70d54","Type":"ContainerDied","Data":"2c1fef6767b0e1fca9d9b3150317fb66c123ea69c5309a134a4ad450d4919157"} Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.581581 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c1fef6767b0e1fca9d9b3150317fb66c123ea69c5309a134a4ad450d4919157" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.588383 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pb49" event={"ID":"a5ba22dd-8e8e-4beb-a540-e5c9687810b8","Type":"ContainerStarted","Data":"7718a309c71ba8a48a463087b2e901f51d954ea050a7be786e3c0a847d6a54eb"} Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.595988 4816 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.596017 4816 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9397185e-a9e3-4ef4-b0be-d9dc9208adff-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.596031 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9397185e-a9e3-4ef4-b0be-d9dc9208adff-config-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.596043 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kube-api-access\") on node 
\"crc\" DevicePath \"\"" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.596055 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdffs\" (UniqueName: \"kubernetes.io/projected/9397185e-a9e3-4ef4-b0be-d9dc9208adff-kube-api-access-pdffs\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:50 crc kubenswrapper[4816]: I0316 00:10:50.427998 4816 ???:1] "http: TLS handshake error from 192.168.126.11:52478: no serving certificate available for the kubelet" Mar 16 00:10:50 crc kubenswrapper[4816]: I0316 00:10:50.559902 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:50 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:50 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:50 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:50 crc kubenswrapper[4816]: I0316 00:10:50.559989 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:50 crc kubenswrapper[4816]: I0316 00:10:50.584232 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-npvts" Mar 16 00:10:51 crc kubenswrapper[4816]: I0316 00:10:51.553632 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:51 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:51 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:51 crc kubenswrapper[4816]: healthz 
check failed Mar 16 00:10:51 crc kubenswrapper[4816]: I0316 00:10:51.553689 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:52 crc kubenswrapper[4816]: I0316 00:10:52.556586 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:52 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:52 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:52 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:52 crc kubenswrapper[4816]: I0316 00:10:52.556919 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:53 crc kubenswrapper[4816]: I0316 00:10:53.246477 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:53 crc kubenswrapper[4816]: I0316 00:10:53.248233 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 16 00:10:53 crc kubenswrapper[4816]: I0316 00:10:53.267666 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod 
\"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:53 crc kubenswrapper[4816]: I0316 00:10:53.384527 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 16 00:10:53 crc kubenswrapper[4816]: I0316 00:10:53.393149 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:53 crc kubenswrapper[4816]: I0316 00:10:53.552771 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:53 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:53 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:53 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:53 crc kubenswrapper[4816]: I0316 00:10:53.552845 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.420048 4816 patch_prober.go:28] interesting pod/console-f9d7485db-nnqsw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.420677 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nnqsw" podUID="32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: 
connect: connection refused" Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.500254 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6gbkl"] Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.561721 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:54 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:54 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:54 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.561793 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.581057 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv"] Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.595518 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ckvwn"] Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.618856 4816 generic.go:334] "Generic (PLEG): container finished" podID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerID="f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f" exitCode=0 Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.618939 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pb49" event={"ID":"a5ba22dd-8e8e-4beb-a540-e5c9687810b8","Type":"ContainerDied","Data":"f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f"} Mar 16 
00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.620214 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6gbkl" event={"ID":"bad7b5f7-88a8-4c20-a010-734a46f59e05","Type":"ContainerStarted","Data":"dfccefcb0f8e6864404f0a8715036becb9b7ec4a3aef59dca2da5e935bde36d5"} Mar 16 00:10:54 crc kubenswrapper[4816]: W0316 00:10:54.631372 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93c061cd_ed29_4f9c_ad3a_0fb204ce6f8d.slice/crio-e26612b919b84db051d8d1f5b5f0b9a5f292e7d09dcad15803116b4fcd5c25d6 WatchSource:0}: Error finding container e26612b919b84db051d8d1f5b5f0b9a5f292e7d09dcad15803116b4fcd5c25d6: Status 404 returned error can't find the container with id e26612b919b84db051d8d1f5b5f0b9a5f292e7d09dcad15803116b4fcd5c25d6 Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.669984 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t"] Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.688878 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-52qs6"] Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.736972 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jqsjn"] Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.747831 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.747868 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.747868 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.747956 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:10:54 crc kubenswrapper[4816]: W0316 00:10:54.763135 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84360ef9_0450_44c5_80eb_eab1bf8e808b.slice/crio-f8a8d5047faf25a14762538ff1be7c473a628d88a02f801a939974322aa4c0fd WatchSource:0}: Error finding container f8a8d5047faf25a14762538ff1be7c473a628d88a02f801a939974322aa4c0fd: Status 404 returned error can't find the container with id f8a8d5047faf25a14762538ff1be7c473a628d88a02f801a939974322aa4c0fd Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.768076 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hvpqn"] Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.554143 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:55 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:55 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:55 crc kubenswrapper[4816]: healthz 
check failed
Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.554885 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.629768 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" event={"ID":"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d","Type":"ContainerStarted","Data":"8fdaf15dae2b09a126e743d43c57d752450921c814448bf980f67e094859a0df"}
Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.629819 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" event={"ID":"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d","Type":"ContainerStarted","Data":"e26612b919b84db051d8d1f5b5f0b9a5f292e7d09dcad15803116b4fcd5c25d6"}
Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.631350 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qs6" event={"ID":"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d","Type":"ContainerStarted","Data":"056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503"}
Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.631374 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qs6" event={"ID":"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d","Type":"ContainerStarted","Data":"47689d47c5b861a3bd4357a2faba7a8ab87d56775475b31d461c37bf8423f524"}
Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.635528 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" event={"ID":"1feb17b5-7946-4727-a954-d516a9b8469b","Type":"ContainerStarted","Data":"ed184c9fb61316e6cf511969f5857732bc0cbe98f1fc984c31c588b0377ff308"}
Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.635613 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" event={"ID":"1feb17b5-7946-4727-a954-d516a9b8469b","Type":"ContainerStarted","Data":"2b950732b2a5bf6036d818014b94cf2a7cdbaaa448fc7e9ce26ccb0e98f8f687"}
Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.637941 4816 generic.go:334] "Generic (PLEG): container finished" podID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerID="5259cd97d29c896bcf8ba7141fe44641e990295b28288f54dfe4315de536ad23" exitCode=0
Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.637996 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6gbkl" event={"ID":"bad7b5f7-88a8-4c20-a010-734a46f59e05","Type":"ContainerDied","Data":"5259cd97d29c896bcf8ba7141fe44641e990295b28288f54dfe4315de536ad23"}
Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.641075 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" event={"ID":"b155133b-d494-44bc-aa5d-23efc7cbd7a6","Type":"ContainerStarted","Data":"4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b"}
Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.641120 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" event={"ID":"b155133b-d494-44bc-aa5d-23efc7cbd7a6","Type":"ContainerStarted","Data":"e368502f9ca177437add127848813e2ad33e96c185b8ab726042b2878dcec995"}
Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.643724 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" event={"ID":"84360ef9-0450-44c5-80eb-eab1bf8e808b","Type":"ContainerStarted","Data":"af04d6d535b6c5f27d1a59ca3be2b5e9cd465bd48de2b8ba8e2eb0e281a5d8ac"}
Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.643753 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" event={"ID":"84360ef9-0450-44c5-80eb-eab1bf8e808b","Type":"ContainerStarted","Data":"f8a8d5047faf25a14762538ff1be7c473a628d88a02f801a939974322aa4c0fd"}
Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.647432 4816 generic.go:334] "Generic (PLEG): container finished" podID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerID="d3d02136defedca51b696822546773a5d6f3e05f0581bc5504bae4a17393efcc" exitCode=0
Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.647475 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvpqn" event={"ID":"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d","Type":"ContainerDied","Data":"d3d02136defedca51b696822546773a5d6f3e05f0581bc5504bae4a17393efcc"}
Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.647504 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvpqn" event={"ID":"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d","Type":"ContainerStarted","Data":"a7d840d19860a5867af8d4206630041069552968b0c74710a21974d2b8f8f661"}
Mar 16 00:10:56 crc kubenswrapper[4816]: I0316 00:10:56.552056 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:56 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld
Mar 16 00:10:56 crc kubenswrapper[4816]: [+]process-running ok
Mar 16 00:10:56 crc kubenswrapper[4816]: healthz check failed
Mar 16 00:10:56 crc kubenswrapper[4816]: I0316 00:10:56.552123 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:56 crc kubenswrapper[4816]: I0316 00:10:56.656529 4816 generic.go:334] "Generic (PLEG): container finished" podID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerID="056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503" exitCode=0
Mar 16 00:10:56 crc kubenswrapper[4816]: I0316 00:10:56.656604 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qs6" event={"ID":"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d","Type":"ContainerDied","Data":"056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503"}
Mar 16 00:10:56 crc kubenswrapper[4816]: I0316 00:10:56.657083 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:10:56 crc kubenswrapper[4816]: I0316 00:10:56.675979 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" podStartSLOduration=13.675958219 podStartE2EDuration="13.675958219s" podCreationTimestamp="2026-03-16 00:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:56.673928353 +0000 UTC m=+249.770228326" watchObservedRunningTime="2026-03-16 00:10:56.675958219 +0000 UTC m=+249.772258172"
Mar 16 00:10:56 crc kubenswrapper[4816]: I0316 00:10:56.716741 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" podStartSLOduration=198.716718481 podStartE2EDuration="3m18.716718481s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:56.714326976 +0000 UTC m=+249.810626949" watchObservedRunningTime="2026-03-16 00:10:56.716718481 +0000 UTC m=+249.813018434"
Mar 16 00:10:56 crc kubenswrapper[4816]: I0316 00:10:56.772592 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" podStartSLOduration=13.772574126 podStartE2EDuration="13.772574126s" podCreationTimestamp="2026-03-16 00:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:56.770073358 +0000 UTC m=+249.866373311" watchObservedRunningTime="2026-03-16 00:10:56.772574126 +0000 UTC m=+249.868874079"
Mar 16 00:10:57 crc kubenswrapper[4816]: I0316 00:10:57.555012 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:57 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld
Mar 16 00:10:57 crc kubenswrapper[4816]: [+]process-running ok
Mar 16 00:10:57 crc kubenswrapper[4816]: healthz check failed
Mar 16 00:10:57 crc kubenswrapper[4816]: I0316 00:10:57.555365 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:57 crc kubenswrapper[4816]: I0316 00:10:57.698270 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" event={"ID":"84360ef9-0450-44c5-80eb-eab1bf8e808b","Type":"ContainerStarted","Data":"c025ab66399e1eec77c882d25daa4e18498f64990e5075d55e63826832d6af3d"}
Mar 16 00:10:57 crc kubenswrapper[4816]: I0316 00:10:57.712988 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560330-44pts" event={"ID":"55e76e8f-7d69-4f55-81f8-45c9c612876b","Type":"ContainerStarted","Data":"a0546877ac51e8fef907f2152b03530a1aaadfb1ec0bb2cad119c19beb5651ba"}
Mar 16 00:10:57 crc kubenswrapper[4816]: I0316 00:10:57.756736 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29560330-44pts" podStartSLOduration=36.693446902 podStartE2EDuration="57.756702386s" podCreationTimestamp="2026-03-16 00:10:00 +0000 UTC" firstStartedPulling="2026-03-16 00:10:35.702247599 +0000 UTC m=+228.798547552" lastFinishedPulling="2026-03-16 00:10:56.765503093 +0000 UTC m=+249.861803036" observedRunningTime="2026-03-16 00:10:57.755651037 +0000 UTC m=+250.851950990" watchObservedRunningTime="2026-03-16 00:10:57.756702386 +0000 UTC m=+250.853002339"
Mar 16 00:10:57 crc kubenswrapper[4816]: I0316 00:10:57.772835 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jqsjn" podStartSLOduration=200.772814736 podStartE2EDuration="3m20.772814736s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:57.770612726 +0000 UTC m=+250.866912699" watchObservedRunningTime="2026-03-16 00:10:57.772814736 +0000 UTC m=+250.869114689"
Mar 16 00:10:57 crc kubenswrapper[4816]: I0316 00:10:57.972261 4816 csr.go:261] certificate signing request csr-kjjbq is approved, waiting to be issued
Mar 16 00:10:57 crc kubenswrapper[4816]: I0316 00:10:57.980628 4816 csr.go:257] certificate signing request csr-kjjbq is issued
Mar 16 00:10:58 crc kubenswrapper[4816]: I0316 00:10:58.554062 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:58 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld
Mar 16 00:10:58 crc kubenswrapper[4816]: [+]process-running ok
Mar 16 00:10:58 crc kubenswrapper[4816]: healthz check failed
Mar 16 00:10:58 crc kubenswrapper[4816]: I0316 00:10:58.554122 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:58 crc kubenswrapper[4816]: I0316 00:10:58.723606 4816 generic.go:334] "Generic (PLEG): container finished" podID="55e76e8f-7d69-4f55-81f8-45c9c612876b" containerID="a0546877ac51e8fef907f2152b03530a1aaadfb1ec0bb2cad119c19beb5651ba" exitCode=0
Mar 16 00:10:58 crc kubenswrapper[4816]: I0316 00:10:58.723668 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560330-44pts" event={"ID":"55e76e8f-7d69-4f55-81f8-45c9c612876b","Type":"ContainerDied","Data":"a0546877ac51e8fef907f2152b03530a1aaadfb1ec0bb2cad119c19beb5651ba"}
Mar 16 00:10:58 crc kubenswrapper[4816]: I0316 00:10:58.981684 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-30 23:38:08.310272768 +0000 UTC
Mar 16 00:10:58 crc kubenswrapper[4816]: I0316 00:10:58.981723 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6959h27m9.328552206s for next certificate rotation
Mar 16 00:10:59 crc kubenswrapper[4816]: I0316 00:10:59.553783 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:59 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld
Mar 16 00:10:59 crc kubenswrapper[4816]: [+]process-running ok
Mar 16 00:10:59 crc kubenswrapper[4816]: healthz check failed
Mar 16 00:10:59 crc kubenswrapper[4816]: I0316 00:10:59.553840 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:59 crc kubenswrapper[4816]: I0316 00:10:59.982593 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-29 11:23:32.366681093 +0000 UTC
Mar 16 00:10:59 crc kubenswrapper[4816]: I0316 00:10:59.982633 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6203h12m32.384051423s for next certificate rotation
Mar 16 00:11:00 crc kubenswrapper[4816]: I0316 00:11:00.053345 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560330-44pts"
Mar 16 00:11:00 crc kubenswrapper[4816]: I0316 00:11:00.153659 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq52v\" (UniqueName: \"kubernetes.io/projected/55e76e8f-7d69-4f55-81f8-45c9c612876b-kube-api-access-dq52v\") pod \"55e76e8f-7d69-4f55-81f8-45c9c612876b\" (UID: \"55e76e8f-7d69-4f55-81f8-45c9c612876b\") "
Mar 16 00:11:00 crc kubenswrapper[4816]: I0316 00:11:00.160775 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55e76e8f-7d69-4f55-81f8-45c9c612876b-kube-api-access-dq52v" (OuterVolumeSpecName: "kube-api-access-dq52v") pod "55e76e8f-7d69-4f55-81f8-45c9c612876b" (UID: "55e76e8f-7d69-4f55-81f8-45c9c612876b"). InnerVolumeSpecName "kube-api-access-dq52v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:11:00 crc kubenswrapper[4816]: I0316 00:11:00.254890 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq52v\" (UniqueName: \"kubernetes.io/projected/55e76e8f-7d69-4f55-81f8-45c9c612876b-kube-api-access-dq52v\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:00 crc kubenswrapper[4816]: I0316 00:11:00.554560 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-gvk75"
Mar 16 00:11:00 crc kubenswrapper[4816]: I0316 00:11:00.558587 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-gvk75"
Mar 16 00:11:00 crc kubenswrapper[4816]: I0316 00:11:00.746262 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560330-44pts"
Mar 16 00:11:00 crc kubenswrapper[4816]: I0316 00:11:00.746278 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560330-44pts" event={"ID":"55e76e8f-7d69-4f55-81f8-45c9c612876b","Type":"ContainerDied","Data":"14379482594ebf801c25583d0aab03c78f3555265f22f25f8cbeb498177ecef2"}
Mar 16 00:11:00 crc kubenswrapper[4816]: I0316 00:11:00.746338 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14379482594ebf801c25583d0aab03c78f3555265f22f25f8cbeb498177ecef2"
Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.863033 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.863087 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.886477 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t"]
Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.887057 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" podUID="1feb17b5-7946-4727-a954-d516a9b8469b" containerName="controller-manager" containerID="cri-o://ed184c9fb61316e6cf511969f5857732bc0cbe98f1fc984c31c588b0377ff308" gracePeriod=30
Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.890011 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t"
Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.893158 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv"]
Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.893408 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" podUID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" containerName="route-controller-manager" containerID="cri-o://8fdaf15dae2b09a126e743d43c57d752450921c814448bf980f67e094859a0df" gracePeriod=30
Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.897737 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv"
Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.910852 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv"
Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.926931 4816 patch_prober.go:28] interesting pod/controller-manager-64d6fc58d9-vgh6t container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": read tcp 10.217.0.2:54684->10.217.0.54:8443: read: connection reset by peer" start-of-body=
Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.926991 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" podUID="1feb17b5-7946-4727-a954-d516a9b8469b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": read tcp 10.217.0.2:54684->10.217.0.54:8443: read: connection reset by peer"
Mar 16 00:11:02 crc kubenswrapper[4816]: I0316 00:11:02.760480 4816 generic.go:334] "Generic (PLEG): container finished" podID="1feb17b5-7946-4727-a954-d516a9b8469b" containerID="ed184c9fb61316e6cf511969f5857732bc0cbe98f1fc984c31c588b0377ff308" exitCode=0
Mar 16 00:11:02 crc kubenswrapper[4816]: I0316 00:11:02.760587 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" event={"ID":"1feb17b5-7946-4727-a954-d516a9b8469b","Type":"ContainerDied","Data":"ed184c9fb61316e6cf511969f5857732bc0cbe98f1fc984c31c588b0377ff308"}
Mar 16 00:11:02 crc kubenswrapper[4816]: I0316 00:11:02.762813 4816 generic.go:334] "Generic (PLEG): container finished" podID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" containerID="8fdaf15dae2b09a126e743d43c57d752450921c814448bf980f67e094859a0df" exitCode=0
Mar 16 00:11:02 crc kubenswrapper[4816]: I0316 00:11:02.762841 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" event={"ID":"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d","Type":"ContainerDied","Data":"8fdaf15dae2b09a126e743d43c57d752450921c814448bf980f67e094859a0df"}
Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.419263 4816 patch_prober.go:28] interesting pod/console-f9d7485db-nnqsw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.419324 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nnqsw" podUID="32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused"
Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.749012 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.749050 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.749078 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.749152 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.749234 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-5rr7c"
Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.749952 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.750169 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.750413 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"9e1758947a169fa8c89c8e3873ca56d930c8ca55c7c143100afca371ccc218fc"} pod="openshift-console/downloads-7954f5f757-5rr7c" containerMessage="Container download-server failed liveness probe, will be restarted"
Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.750772 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" containerID="cri-o://9e1758947a169fa8c89c8e3873ca56d930c8ca55c7c143100afca371ccc218fc" gracePeriod=2
Mar 16 00:11:05 crc kubenswrapper[4816]: I0316 00:11:05.707415 4816 patch_prober.go:28] interesting pod/route-controller-manager-6d597ffc5b-jhblv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body=
Mar 16 00:11:05 crc kubenswrapper[4816]: I0316 00:11:05.707777 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" podUID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused"
Mar 16 00:11:05 crc kubenswrapper[4816]: I0316 00:11:05.782819 4816 generic.go:334] "Generic (PLEG): container finished" podID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerID="9e1758947a169fa8c89c8e3873ca56d930c8ca55c7c143100afca371ccc218fc" exitCode=0
Mar 16 00:11:05 crc kubenswrapper[4816]: I0316 00:11:05.782864 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5rr7c" event={"ID":"0ec3cdc0-f024-43cf-b520-7d2437e0f8df","Type":"ContainerDied","Data":"9e1758947a169fa8c89c8e3873ca56d930c8ca55c7c143100afca371ccc218fc"}
Mar 16 00:11:06 crc kubenswrapper[4816]: I0316 00:11:06.701642 4816 patch_prober.go:28] interesting pod/controller-manager-64d6fc58d9-vgh6t container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 16 00:11:06 crc kubenswrapper[4816]: I0316 00:11:06.701791 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" podUID="1feb17b5-7946-4727-a954-d516a9b8469b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 16 00:11:14 crc kubenswrapper[4816]: I0316 00:11:14.423798 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:11:14 crc kubenswrapper[4816]: I0316 00:11:14.432060 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:11:14 crc kubenswrapper[4816]: I0316 00:11:14.747970 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Mar 16 00:11:14 crc kubenswrapper[4816]: I0316 00:11:14.748331 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.167733 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.245997 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.283970 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"]
Mar 16 00:11:15 crc kubenswrapper[4816]: E0316 00:11:15.284243 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c90a604-be49-44dc-b350-9df660d8587b" containerName="pruner"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284259 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c90a604-be49-44dc-b350-9df660d8587b" containerName="pruner"
Mar 16 00:11:15 crc kubenswrapper[4816]: E0316 00:11:15.284273 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c3b4df-48c2-4131-8ac4-ea6276d70d54" containerName="pruner"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284282 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c3b4df-48c2-4131-8ac4-ea6276d70d54" containerName="pruner"
Mar 16 00:11:15 crc kubenswrapper[4816]: E0316 00:11:15.284302 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e76e8f-7d69-4f55-81f8-45c9c612876b" containerName="oc"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284311 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e76e8f-7d69-4f55-81f8-45c9c612876b" containerName="oc"
Mar 16 00:11:15 crc kubenswrapper[4816]: E0316 00:11:15.284325 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9397185e-a9e3-4ef4-b0be-d9dc9208adff" containerName="collect-profiles"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284333 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9397185e-a9e3-4ef4-b0be-d9dc9208adff" containerName="collect-profiles"
Mar 16 00:11:15 crc kubenswrapper[4816]: E0316 00:11:15.284342 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1feb17b5-7946-4727-a954-d516a9b8469b" containerName="controller-manager"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284350 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1feb17b5-7946-4727-a954-d516a9b8469b" containerName="controller-manager"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284468 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c90a604-be49-44dc-b350-9df660d8587b" containerName="pruner"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284483 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1feb17b5-7946-4727-a954-d516a9b8469b" containerName="controller-manager"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284496 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="55e76e8f-7d69-4f55-81f8-45c9c612876b" containerName="oc"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284510 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="46c3b4df-48c2-4131-8ac4-ea6276d70d54" containerName="pruner"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284524 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9397185e-a9e3-4ef4-b0be-d9dc9208adff" containerName="collect-profiles"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.292330 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.292373 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"]
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.293223 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.401319 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-client-ca\") pod \"1feb17b5-7946-4727-a954-d516a9b8469b\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") "
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.401369 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1feb17b5-7946-4727-a954-d516a9b8469b-serving-cert\") pod \"1feb17b5-7946-4727-a954-d516a9b8469b\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") "
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.401437 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnvd7\" (UniqueName: \"kubernetes.io/projected/1feb17b5-7946-4727-a954-d516a9b8469b-kube-api-access-jnvd7\") pod \"1feb17b5-7946-4727-a954-d516a9b8469b\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") "
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.401479 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-config\") pod \"1feb17b5-7946-4727-a954-d516a9b8469b\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") "
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.401505 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-proxy-ca-bundles\") pod \"1feb17b5-7946-4727-a954-d516a9b8469b\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") "
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.401728 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-proxy-ca-bundles\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.401768 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl55f\" (UniqueName: \"kubernetes.io/projected/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-kube-api-access-hl55f\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.402034 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-client-ca\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.402063 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-serving-cert\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.402137 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-config\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.402219 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-client-ca" (OuterVolumeSpecName: "client-ca") pod "1feb17b5-7946-4727-a954-d516a9b8469b" (UID: "1feb17b5-7946-4727-a954-d516a9b8469b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.402296 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-config" (OuterVolumeSpecName: "config") pod "1feb17b5-7946-4727-a954-d516a9b8469b" (UID: "1feb17b5-7946-4727-a954-d516a9b8469b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.402471 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1feb17b5-7946-4727-a954-d516a9b8469b" (UID: "1feb17b5-7946-4727-a954-d516a9b8469b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.406394 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1feb17b5-7946-4727-a954-d516a9b8469b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1feb17b5-7946-4727-a954-d516a9b8469b" (UID: "1feb17b5-7946-4727-a954-d516a9b8469b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.407527 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1feb17b5-7946-4727-a954-d516a9b8469b-kube-api-access-jnvd7" (OuterVolumeSpecName: "kube-api-access-jnvd7") pod "1feb17b5-7946-4727-a954-d516a9b8469b" (UID: "1feb17b5-7946-4727-a954-d516a9b8469b"). InnerVolumeSpecName "kube-api-access-jnvd7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.503970 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-proxy-ca-bundles\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.504038 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl55f\" (UniqueName: \"kubernetes.io/projected/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-kube-api-access-hl55f\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.504077 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-client-ca\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"
Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.504106 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName:
\"kubernetes.io/secret/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-serving-cert\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.504159 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-config\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.504236 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.504250 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1feb17b5-7946-4727-a954-d516a9b8469b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.504264 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnvd7\" (UniqueName: \"kubernetes.io/projected/1feb17b5-7946-4727-a954-d516a9b8469b-kube-api-access-jnvd7\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.504277 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.504289 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:15 crc 
kubenswrapper[4816]: I0316 00:11:15.507481 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-client-ca\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.508875 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-proxy-ca-bundles\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.509274 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-config\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.511049 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-serving-cert\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.520690 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl55f\" (UniqueName: \"kubernetes.io/projected/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-kube-api-access-hl55f\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " 
pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.610025 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.707185 4816 patch_prober.go:28] interesting pod/route-controller-manager-6d597ffc5b-jhblv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body= Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.707249 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" podUID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.844398 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" event={"ID":"1feb17b5-7946-4727-a954-d516a9b8469b","Type":"ContainerDied","Data":"2b950732b2a5bf6036d818014b94cf2a7cdbaaa448fc7e9ce26ccb0e98f8f687"} Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.844452 4816 scope.go:117] "RemoveContainer" containerID="ed184c9fb61316e6cf511969f5857732bc0cbe98f1fc984c31c588b0377ff308" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.845154 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.868569 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t"] Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.875524 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t"] Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.001024 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.001920 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.006372 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.006496 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.011685 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.152898 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.152990 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.254817 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.254909 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.255001 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.271224 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.324958 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.678243 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1feb17b5-7946-4727-a954-d516a9b8469b" path="/var/lib/kubelet/pods/1feb17b5-7946-4727-a954-d516a9b8469b/volumes" Mar 16 00:11:21 crc kubenswrapper[4816]: I0316 00:11:21.908472 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"] Mar 16 00:11:21 crc kubenswrapper[4816]: I0316 00:11:21.963885 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 16 00:11:21 crc kubenswrapper[4816]: I0316 00:11:21.964806 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:21 crc kubenswrapper[4816]: I0316 00:11:21.982913 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.017743 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-var-lock\") pod \"installer-9-crc\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.017843 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9416716-e666-46d6-9d77-fe5c9702c035-kube-api-access\") pod \"installer-9-crc\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.017865 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.119288 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-var-lock\") pod \"installer-9-crc\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.119380 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9416716-e666-46d6-9d77-fe5c9702c035-kube-api-access\") pod \"installer-9-crc\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.119408 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.119416 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-var-lock\") pod \"installer-9-crc\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.119468 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-kubelet-dir\") pod \"installer-9-crc\" (UID: 
\"a9416716-e666-46d6-9d77-fe5c9702c035\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.136380 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9416716-e666-46d6-9d77-fe5c9702c035-kube-api-access\") pod \"installer-9-crc\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.279475 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:24 crc kubenswrapper[4816]: I0316 00:11:24.747496 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:24 crc kubenswrapper[4816]: I0316 00:11:24.747905 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:26 crc kubenswrapper[4816]: I0316 00:11:26.707510 4816 patch_prober.go:28] interesting pod/route-controller-manager-6d597ffc5b-jhblv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 16 00:11:28 crc kubenswrapper[4816]: I0316 00:11:26.707617 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" 
podUID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 16 00:11:28 crc kubenswrapper[4816]: E0316 00:11:27.895891 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 16 00:11:28 crc kubenswrapper[4816]: E0316 00:11:27.896079 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8gwq2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorPro
file:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-8q2xw_openshift-marketplace(bf586c6e-f957-46fc-8140-f9a9ea22510f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:28 crc kubenswrapper[4816]: E0316 00:11:27.897331 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-8q2xw" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" Mar 16 00:11:31 crc kubenswrapper[4816]: I0316 00:11:31.863311 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:11:31 crc kubenswrapper[4816]: I0316 00:11:31.863818 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:11:32 crc kubenswrapper[4816]: E0316 00:11:32.843728 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 16 00:11:32 crc kubenswrapper[4816]: E0316 00:11:32.843903 4816 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qk7j9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-hvpqn_openshift-marketplace(cc1ea93d-1cf8-4145-ad35-83f2d1357f9d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:32 crc kubenswrapper[4816]: E0316 00:11:32.845160 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-hvpqn" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" Mar 16 00:11:34 crc kubenswrapper[4816]: I0316 00:11:34.748540 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:34 crc kubenswrapper[4816]: I0316 00:11:34.749705 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:36 crc kubenswrapper[4816]: I0316 00:11:36.707469 4816 patch_prober.go:28] interesting pod/route-controller-manager-6d597ffc5b-jhblv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: i/o timeout" start-of-body= Mar 16 00:11:36 crc kubenswrapper[4816]: I0316 00:11:36.707817 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" podUID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: i/o timeout" Mar 16 00:11:37 crc kubenswrapper[4816]: E0316 00:11:37.852354 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 16 00:11:37 crc kubenswrapper[4816]: E0316 00:11:37.852808 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w4mjs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-wh2h7_openshift-marketplace(b1b3efd0-cdc0-4973-8077-bcd1ea567bdd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 
00:11:37 crc kubenswrapper[4816]: E0316 00:11:37.854365 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-wh2h7" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" Mar 16 00:11:39 crc kubenswrapper[4816]: E0316 00:11:39.385971 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-wh2h7" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" Mar 16 00:11:40 crc kubenswrapper[4816]: E0316 00:11:40.052943 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 16 00:11:40 crc kubenswrapper[4816]: E0316 00:11:40.053289 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-45bbd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-4gwcw_openshift-marketplace(ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:40 crc kubenswrapper[4816]: E0316 00:11:40.054441 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-4gwcw" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" Mar 16 00:11:41 crc 
kubenswrapper[4816]: E0316 00:11:41.822687 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-4gwcw" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" Mar 16 00:11:41 crc kubenswrapper[4816]: E0316 00:11:41.883993 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 16 00:11:41 crc kubenswrapper[4816]: E0316 00:11:41.884229 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mrpff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-52qs6_openshift-marketplace(6ca6c2c9-3a12-4eb3-9df1-7fdea640791d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:41 crc kubenswrapper[4816]: E0316 00:11:41.886090 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-52qs6" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" Mar 16 00:11:41 crc 
kubenswrapper[4816]: I0316 00:11:41.915109 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:11:41 crc kubenswrapper[4816]: E0316 00:11:41.915914 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 16 00:11:41 crc kubenswrapper[4816]: E0316 00:11:41.916097 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l2d9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:
[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6gbkl_openshift-marketplace(bad7b5f7-88a8-4c20-a010-734a46f59e05): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:41 crc kubenswrapper[4816]: E0316 00:11:41.918358 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6gbkl" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" Mar 16 00:11:41 crc kubenswrapper[4816]: I0316 00:11:41.950409 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx"] Mar 16 00:11:41 crc kubenswrapper[4816]: E0316 00:11:41.951799 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" containerName="route-controller-manager" Mar 16 00:11:41 crc kubenswrapper[4816]: I0316 00:11:41.951826 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" containerName="route-controller-manager" Mar 16 00:11:41 crc kubenswrapper[4816]: I0316 00:11:41.968411 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" containerName="route-controller-manager" Mar 16 00:11:41 crc kubenswrapper[4816]: I0316 00:11:41.969343 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:41 crc kubenswrapper[4816]: I0316 00:11:41.970311 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx"] Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.016881 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" event={"ID":"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d","Type":"ContainerDied","Data":"e26612b919b84db051d8d1f5b5f0b9a5f292e7d09dcad15803116b4fcd5c25d6"} Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.016989 4816 scope.go:117] "RemoveContainer" containerID="8fdaf15dae2b09a126e743d43c57d752450921c814448bf980f67e094859a0df" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.017224 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:11:42 crc kubenswrapper[4816]: E0316 00:11:42.020298 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6gbkl" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" Mar 16 00:11:42 crc kubenswrapper[4816]: E0316 00:11:42.020772 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-52qs6" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.037542 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-serving-cert\") pod \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.037785 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-client-ca\") pod \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.037862 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-config\") pod \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.037928 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt6qg\" (UniqueName: \"kubernetes.io/projected/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-kube-api-access-qt6qg\") pod \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.046274 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-config" (OuterVolumeSpecName: "config") pod "93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" (UID: "93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.046266 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-client-ca" (OuterVolumeSpecName: "client-ca") pod "93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" (UID: "93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.050172 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" (UID: "93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.050256 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-kube-api-access-qt6qg" (OuterVolumeSpecName: "kube-api-access-qt6qg") pod "93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" (UID: "93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d"). InnerVolumeSpecName "kube-api-access-qt6qg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.140545 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snc4z\" (UniqueName: \"kubernetes.io/projected/d843d76b-9317-42aa-848b-e3e11c3106cb-kube-api-access-snc4z\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.141156 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-client-ca\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.141189 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d843d76b-9317-42aa-848b-e3e11c3106cb-serving-cert\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.141216 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-config\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.141301 4816 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.141317 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.141327 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.141338 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt6qg\" (UniqueName: \"kubernetes.io/projected/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-kube-api-access-qt6qg\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.201415 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"] Mar 16 00:11:42 crc kubenswrapper[4816]: W0316 00:11:42.207634 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8a656b3_b809_44ee_bcc8_cfec2ae27ec7.slice/crio-29366548a9919c43dc76196eb98a6cdaf85a57ba7ece5ccd1f7de91db89bc729 WatchSource:0}: Error finding container 29366548a9919c43dc76196eb98a6cdaf85a57ba7ece5ccd1f7de91db89bc729: Status 404 returned error can't find the container with id 29366548a9919c43dc76196eb98a6cdaf85a57ba7ece5ccd1f7de91db89bc729 Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.243172 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-client-ca\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: 
\"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.243218 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d843d76b-9317-42aa-848b-e3e11c3106cb-serving-cert\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.243252 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-config\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.243315 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snc4z\" (UniqueName: \"kubernetes.io/projected/d843d76b-9317-42aa-848b-e3e11c3106cb-kube-api-access-snc4z\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.244863 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-client-ca\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.245296 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-config\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.252198 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d843d76b-9317-42aa-848b-e3e11c3106cb-serving-cert\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.263821 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snc4z\" (UniqueName: \"kubernetes.io/projected/d843d76b-9317-42aa-848b-e3e11c3106cb-kube-api-access-snc4z\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.313013 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.345598 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv"] Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.348896 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv"] Mar 16 00:11:42 crc kubenswrapper[4816]: E0316 00:11:42.387828 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 16 00:11:42 crc kubenswrapper[4816]: E0316 00:11:42.387982 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j24xj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-7pb49_openshift-marketplace(a5ba22dd-8e8e-4beb-a540-e5c9687810b8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:42 crc kubenswrapper[4816]: E0316 00:11:42.389190 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-7pb49" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" Mar 16 00:11:42 crc 
kubenswrapper[4816]: I0316 00:11:42.473489 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.485197 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.541531 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx"] Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.022463 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" event={"ID":"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7","Type":"ContainerStarted","Data":"1622414dc19bed94547791fb46aea5e67b087d0109b950fdff872fc6af3fe300"} Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.022826 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.022838 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" event={"ID":"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7","Type":"ContainerStarted","Data":"29366548a9919c43dc76196eb98a6cdaf85a57ba7ece5ccd1f7de91db89bc729"} Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.022544 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" podUID="a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" containerName="controller-manager" containerID="cri-o://1622414dc19bed94547791fb46aea5e67b087d0109b950fdff872fc6af3fe300" gracePeriod=30 Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.025180 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" 
event={"ID":"d843d76b-9317-42aa-848b-e3e11c3106cb","Type":"ContainerStarted","Data":"f342289f8f13f5d89f00dac92a6b213282ffa583b6c1a48b772dae90dc55fd82"} Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.026109 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a","Type":"ContainerStarted","Data":"934833059dddc073b4415862e96aafb4ed1091c7c9cabca244d231fa20d34d92"} Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.027758 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a9416716-e666-46d6-9d77-fe5c9702c035","Type":"ContainerStarted","Data":"dd148857a3f8f3853eb8381f642acb80c9aad6dc4ab5491e0ecfe89f172f60d6"} Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.033306 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5rr7c" event={"ID":"0ec3cdc0-f024-43cf-b520-7d2437e0f8df","Type":"ContainerStarted","Data":"61c9642fe76a811268f1e4f78cdfc2538ce75e905554ea5575dfd07e151f6573"} Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.034935 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.034972 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:43 crc kubenswrapper[4816]: E0316 00:11:43.035904 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-7pb49" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.061691 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" podStartSLOduration=42.061674202 podStartE2EDuration="42.061674202s" podCreationTimestamp="2026-03-16 00:11:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:11:43.041486147 +0000 UTC m=+296.137786100" watchObservedRunningTime="2026-03-16 00:11:43.061674202 +0000 UTC m=+296.157974155" Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.094799 4816 patch_prober.go:28] interesting pod/controller-manager-7dc5f6d8dc-9cs6b container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": EOF" start-of-body= Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.094856 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" podUID="a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": EOF" Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.672918 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" path="/var/lib/kubelet/pods/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d/volumes" Mar 16 00:11:44 crc kubenswrapper[4816]: I0316 00:11:44.040614 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" 
event={"ID":"d843d76b-9317-42aa-848b-e3e11c3106cb","Type":"ContainerStarted","Data":"5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634"} Mar 16 00:11:44 crc kubenswrapper[4816]: I0316 00:11:44.043350 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a","Type":"ContainerStarted","Data":"29ea047cfa535477d88409add4c285e480d3dd9e6f79bea9d43c76200c1b38cb"} Mar 16 00:11:44 crc kubenswrapper[4816]: I0316 00:11:44.043843 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:44 crc kubenswrapper[4816]: I0316 00:11:44.043911 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4816]: I0316 00:11:44.044192 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-5rr7c" Mar 16 00:11:44 crc kubenswrapper[4816]: I0316 00:11:44.747989 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:44 crc kubenswrapper[4816]: I0316 00:11:44.748083 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": 
dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4816]: I0316 00:11:44.747989 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:44 crc kubenswrapper[4816]: I0316 00:11:44.748322 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4816]: E0316 00:11:44.835306 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 16 00:11:44 crc kubenswrapper[4816]: E0316 00:11:44.835494 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p8frh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-bkxpc_openshift-marketplace(ff69863d-13e1-444c-ba61-6d68a509a203): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:44 crc kubenswrapper[4816]: E0316 00:11:44.837002 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-bkxpc" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" Mar 16 00:11:45 crc 
kubenswrapper[4816]: I0316 00:11:45.050532 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a9416716-e666-46d6-9d77-fe5c9702c035","Type":"ContainerStarted","Data":"dd1bd79082698289c067776233601bb17f6ba8cbb98dcb745b3329e5f4f6fb1f"} Mar 16 00:11:45 crc kubenswrapper[4816]: I0316 00:11:45.052684 4816 generic.go:334] "Generic (PLEG): container finished" podID="a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" containerID="1622414dc19bed94547791fb46aea5e67b087d0109b950fdff872fc6af3fe300" exitCode=0 Mar 16 00:11:45 crc kubenswrapper[4816]: I0316 00:11:45.052793 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" event={"ID":"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7","Type":"ContainerDied","Data":"1622414dc19bed94547791fb46aea5e67b087d0109b950fdff872fc6af3fe300"} Mar 16 00:11:45 crc kubenswrapper[4816]: I0316 00:11:45.053262 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:45 crc kubenswrapper[4816]: I0316 00:11:45.053326 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4816]: E0316 00:11:45.054585 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-bkxpc" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" Mar 16 00:11:45 
crc kubenswrapper[4816]: I0316 00:11:45.080731 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" podStartSLOduration=23.080714836 podStartE2EDuration="23.080714836s" podCreationTimestamp="2026-03-16 00:11:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:11:45.079044941 +0000 UTC m=+298.175344924" watchObservedRunningTime="2026-03-16 00:11:45.080714836 +0000 UTC m=+298.177014789" Mar 16 00:11:45 crc kubenswrapper[4816]: I0316 00:11:45.147783 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=29.14776301 podStartE2EDuration="29.14776301s" podCreationTimestamp="2026-03-16 00:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:11:45.141470557 +0000 UTC m=+298.237770520" watchObservedRunningTime="2026-03-16 00:11:45.14776301 +0000 UTC m=+298.244062983" Mar 16 00:11:45 crc kubenswrapper[4816]: I0316 00:11:45.611162 4816 patch_prober.go:28] interesting pod/controller-manager-7dc5f6d8dc-9cs6b container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Mar 16 00:11:45 crc kubenswrapper[4816]: I0316 00:11:45.611232 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" podUID="a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4816]: I0316 00:11:46.060212 4816 
generic.go:334] "Generic (PLEG): container finished" podID="2f10e41c-e6db-4083-bb86-ed0d39cc1a5a" containerID="29ea047cfa535477d88409add4c285e480d3dd9e6f79bea9d43c76200c1b38cb" exitCode=0 Mar 16 00:11:46 crc kubenswrapper[4816]: I0316 00:11:46.060308 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a","Type":"ContainerDied","Data":"29ea047cfa535477d88409add4c285e480d3dd9e6f79bea9d43c76200c1b38cb"} Mar 16 00:11:46 crc kubenswrapper[4816]: I0316 00:11:46.107395 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=25.107377966 podStartE2EDuration="25.107377966s" podCreationTimestamp="2026-03-16 00:11:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:11:46.104713453 +0000 UTC m=+299.201013416" watchObservedRunningTime="2026-03-16 00:11:46.107377966 +0000 UTC m=+299.203677919" Mar 16 00:11:52 crc kubenswrapper[4816]: I0316 00:11:52.313720 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:52 crc kubenswrapper[4816]: I0316 00:11:52.322434 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:54 crc kubenswrapper[4816]: I0316 00:11:54.747984 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:54 crc kubenswrapper[4816]: I0316 00:11:54.748809 4816 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:54 crc kubenswrapper[4816]: I0316 00:11:54.748042 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:54 crc kubenswrapper[4816]: I0316 00:11:54.748946 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4816]: I0316 00:11:56.610777 4816 patch_prober.go:28] interesting pod/controller-manager-7dc5f6d8dc-9cs6b container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: i/o timeout" start-of-body= Mar 16 00:11:56 crc kubenswrapper[4816]: I0316 00:11:56.610880 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" podUID="a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: i/o timeout" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.133520 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560332-wb8kg"] Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.134889 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560332-wb8kg" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.135217 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560332-wb8kg"] Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.138103 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.138582 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.142364 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.142858 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55wrt\" (UniqueName: \"kubernetes.io/projected/e570fb38-3e4c-4b9b-82d9-878ec6a5306f-kube-api-access-55wrt\") pod \"auto-csr-approver-29560332-wb8kg\" (UID: \"e570fb38-3e4c-4b9b-82d9-878ec6a5306f\") " pod="openshift-infra/auto-csr-approver-29560332-wb8kg" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.243301 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55wrt\" (UniqueName: \"kubernetes.io/projected/e570fb38-3e4c-4b9b-82d9-878ec6a5306f-kube-api-access-55wrt\") pod \"auto-csr-approver-29560332-wb8kg\" (UID: \"e570fb38-3e4c-4b9b-82d9-878ec6a5306f\") " pod="openshift-infra/auto-csr-approver-29560332-wb8kg" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.261790 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55wrt\" (UniqueName: \"kubernetes.io/projected/e570fb38-3e4c-4b9b-82d9-878ec6a5306f-kube-api-access-55wrt\") pod \"auto-csr-approver-29560332-wb8kg\" (UID: \"e570fb38-3e4c-4b9b-82d9-878ec6a5306f\") " 
pod="openshift-infra/auto-csr-approver-29560332-wb8kg" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.321322 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.362604 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7fd798dd88-7v9zs"] Mar 16 00:12:00 crc kubenswrapper[4816]: E0316 00:12:00.363288 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" containerName="controller-manager" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.363313 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" containerName="controller-manager" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.363622 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" containerName="controller-manager" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.366018 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.369881 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7fd798dd88-7v9zs"] Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.450662 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-config\") pod \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.450848 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-client-ca\") pod \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.450927 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-serving-cert\") pod \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.450986 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-proxy-ca-bundles\") pod \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.451021 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl55f\" (UniqueName: \"kubernetes.io/projected/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-kube-api-access-hl55f\") pod \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\" (UID: 
\"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.451838 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-config" (OuterVolumeSpecName: "config") pod "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" (UID: "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.452229 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-client-ca" (OuterVolumeSpecName: "client-ca") pod "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" (UID: "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.452249 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" (UID: "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.456538 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560332-wb8kg" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.463927 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" (UID: "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.466786 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-kube-api-access-hl55f" (OuterVolumeSpecName: "kube-api-access-hl55f") pod "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" (UID: "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7"). InnerVolumeSpecName "kube-api-access-hl55f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.552914 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-proxy-ca-bundles\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.552986 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8d5j\" (UniqueName: \"kubernetes.io/projected/c682ba54-60b9-4293-ba42-dbde80524daf-kube-api-access-z8d5j\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.553063 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c682ba54-60b9-4293-ba42-dbde80524daf-serving-cert\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.553110 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-config\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.553325 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-client-ca\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.553427 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.553447 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.553460 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.553474 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl55f\" (UniqueName: \"kubernetes.io/projected/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-kube-api-access-hl55f\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.553483 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.654493 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8d5j\" (UniqueName: \"kubernetes.io/projected/c682ba54-60b9-4293-ba42-dbde80524daf-kube-api-access-z8d5j\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.654596 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-config\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.654642 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c682ba54-60b9-4293-ba42-dbde80524daf-serving-cert\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.654660 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-client-ca\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.654762 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-proxy-ca-bundles\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.657259 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-client-ca\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.657309 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-config\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.657619 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-proxy-ca-bundles\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.659180 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c682ba54-60b9-4293-ba42-dbde80524daf-serving-cert\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.670001 4816 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-z8d5j\" (UniqueName: \"kubernetes.io/projected/c682ba54-60b9-4293-ba42-dbde80524daf-kube-api-access-z8d5j\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.688894 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.163216 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" event={"ID":"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7","Type":"ContainerDied","Data":"29366548a9919c43dc76196eb98a6cdaf85a57ba7ece5ccd1f7de91db89bc729"} Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.163301 4816 scope.go:117] "RemoveContainer" containerID="1622414dc19bed94547791fb46aea5e67b087d0109b950fdff872fc6af3fe300" Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.163332 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.199401 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"] Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.202011 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"] Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.678899 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" path="/var/lib/kubelet/pods/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7/volumes" Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.862737 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.862796 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.862841 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.863456 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c"} 
pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.863513 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" containerID="cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c" gracePeriod=600 Mar 16 00:12:02 crc kubenswrapper[4816]: I0316 00:12:02.971415 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.112012 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kubelet-dir\") pod \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\" (UID: \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\") " Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.112192 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2f10e41c-e6db-4083-bb86-ed0d39cc1a5a" (UID: "2f10e41c-e6db-4083-bb86-ed0d39cc1a5a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.112305 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kube-api-access\") pod \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\" (UID: \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\") " Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.113985 4816 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.130037 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2f10e41c-e6db-4083-bb86-ed0d39cc1a5a" (UID: "2f10e41c-e6db-4083-bb86-ed0d39cc1a5a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.177766 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a","Type":"ContainerDied","Data":"934833059dddc073b4415862e96aafb4ed1091c7c9cabca244d231fa20d34d92"} Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.177807 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="934833059dddc073b4415862e96aafb4ed1091c7c9cabca244d231fa20d34d92" Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.177815 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.180407 4816 generic.go:334] "Generic (PLEG): container finished" podID="dd08ece2-7636-4966-973a-e96a34b70b53" containerID="7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c" exitCode=0 Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.180464 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerDied","Data":"7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c"} Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.215539 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.233774 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7fd798dd88-7v9zs"] Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.482992 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560332-wb8kg"] Mar 16 00:12:03 crc kubenswrapper[4816]: W0316 00:12:03.594311 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode570fb38_3e4c_4b9b_82d9_878ec6a5306f.slice/crio-8490369d24788dc7f985ef3f4779fdeb1e0c4853fea51ce2d69595047a963516 WatchSource:0}: Error finding container 8490369d24788dc7f985ef3f4779fdeb1e0c4853fea51ce2d69595047a963516: Status 404 returned error can't find the container with id 8490369d24788dc7f985ef3f4779fdeb1e0c4853fea51ce2d69595047a963516 Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.206313 4816 generic.go:334] "Generic (PLEG): container finished" 
podID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerID="ac8cc1ccd360062b03be48af7d20fc5e22baa578d5f6d15342b7e9dcce308a09" exitCode=0 Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.206392 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8q2xw" event={"ID":"bf586c6e-f957-46fc-8140-f9a9ea22510f","Type":"ContainerDied","Data":"ac8cc1ccd360062b03be48af7d20fc5e22baa578d5f6d15342b7e9dcce308a09"} Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.210910 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" event={"ID":"c682ba54-60b9-4293-ba42-dbde80524daf","Type":"ContainerStarted","Data":"7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9"} Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.210954 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" event={"ID":"c682ba54-60b9-4293-ba42-dbde80524daf","Type":"ContainerStarted","Data":"3c1f3c1914f9bb0abf88eda5dfdf58f6ce50fa7199c9993ff27cf3aef4e09894"} Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.211635 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.217633 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560332-wb8kg" event={"ID":"e570fb38-3e4c-4b9b-82d9-878ec6a5306f","Type":"ContainerStarted","Data":"8490369d24788dc7f985ef3f4779fdeb1e0c4853fea51ce2d69595047a963516"} Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.223348 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"8214b8a7550606e587b215ee7c72e3638e054dd083cb6fa7b37990d33bec509b"} Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.234050 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.274925 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" podStartSLOduration=43.274903666 podStartE2EDuration="43.274903666s" podCreationTimestamp="2026-03-16 00:11:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:04.260856449 +0000 UTC m=+317.357156402" watchObservedRunningTime="2026-03-16 00:12:04.274903666 +0000 UTC m=+317.371203659" Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.753357 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-5rr7c" Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.235277 4816 generic.go:334] "Generic (PLEG): container finished" podID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerID="67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317" exitCode=0 Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.235362 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4gwcw" event={"ID":"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c","Type":"ContainerDied","Data":"67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317"} Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.238739 4816 generic.go:334] "Generic (PLEG): container finished" podID="ff69863d-13e1-444c-ba61-6d68a509a203" containerID="74ea61ccf0b15157a029e2724ad851af86bb050fe43292729cc5d2b513ea7141" exitCode=0 Mar 16 00:12:05 crc 
kubenswrapper[4816]: I0316 00:12:05.238813 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkxpc" event={"ID":"ff69863d-13e1-444c-ba61-6d68a509a203","Type":"ContainerDied","Data":"74ea61ccf0b15157a029e2724ad851af86bb050fe43292729cc5d2b513ea7141"} Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.242210 4816 generic.go:334] "Generic (PLEG): container finished" podID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerID="c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1" exitCode=0 Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.242281 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wh2h7" event={"ID":"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd","Type":"ContainerDied","Data":"c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1"} Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.246526 4816 generic.go:334] "Generic (PLEG): container finished" podID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerID="92ce11f74b2381302bcae2babd96b3eab76e1d28bfb034c70d8b99be8178dac1" exitCode=0 Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.246596 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6gbkl" event={"ID":"bad7b5f7-88a8-4c20-a010-734a46f59e05","Type":"ContainerDied","Data":"92ce11f74b2381302bcae2babd96b3eab76e1d28bfb034c70d8b99be8178dac1"} Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.256795 4816 generic.go:334] "Generic (PLEG): container finished" podID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerID="12e67fc1baf84e28d7eb14a44704825a68bd0357e121983c70625a0778be907a" exitCode=0 Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.256875 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvpqn" 
event={"ID":"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d","Type":"ContainerDied","Data":"12e67fc1baf84e28d7eb14a44704825a68bd0357e121983c70625a0778be907a"} Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.266588 4816 generic.go:334] "Generic (PLEG): container finished" podID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerID="43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72" exitCode=0 Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.266846 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qs6" event={"ID":"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d","Type":"ContainerDied","Data":"43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72"} Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.272620 4816 generic.go:334] "Generic (PLEG): container finished" podID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerID="908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc" exitCode=0 Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.272727 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pb49" event={"ID":"a5ba22dd-8e8e-4beb-a540-e5c9687810b8","Type":"ContainerDied","Data":"908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc"} Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.295105 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29560332-wb8kg" podStartSLOduration=4.072691934 podStartE2EDuration="5.295085317s" podCreationTimestamp="2026-03-16 00:12:00 +0000 UTC" firstStartedPulling="2026-03-16 00:12:03.706500921 +0000 UTC m=+316.802800874" lastFinishedPulling="2026-03-16 00:12:04.928894294 +0000 UTC m=+318.025194257" observedRunningTime="2026-03-16 00:12:05.29411605 +0000 UTC m=+318.390416003" watchObservedRunningTime="2026-03-16 00:12:05.295085317 +0000 UTC m=+318.391385270" Mar 16 00:12:06 crc kubenswrapper[4816]: I0316 00:12:06.283142 4816 
generic.go:334] "Generic (PLEG): container finished" podID="e570fb38-3e4c-4b9b-82d9-878ec6a5306f" containerID="c422afc027f6d729cf317777cce7cb5de5ed92334512743c933f67e04e4724ef" exitCode=0 Mar 16 00:12:06 crc kubenswrapper[4816]: I0316 00:12:06.283334 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560332-wb8kg" event={"ID":"e570fb38-3e4c-4b9b-82d9-878ec6a5306f","Type":"ContainerDied","Data":"c422afc027f6d729cf317777cce7cb5de5ed92334512743c933f67e04e4724ef"} Mar 16 00:12:06 crc kubenswrapper[4816]: I0316 00:12:06.286540 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8q2xw" event={"ID":"bf586c6e-f957-46fc-8140-f9a9ea22510f","Type":"ContainerStarted","Data":"050311929b94abcb5fed29f67a5c5b0ea4b9aaa7f08d32d15808bc1dd56bc7c0"} Mar 16 00:12:06 crc kubenswrapper[4816]: I0316 00:12:06.324650 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8q2xw" podStartSLOduration=8.479037156 podStartE2EDuration="1m24.324621535s" podCreationTimestamp="2026-03-16 00:10:42 +0000 UTC" firstStartedPulling="2026-03-16 00:10:49.241831966 +0000 UTC m=+242.338131939" lastFinishedPulling="2026-03-16 00:12:05.087416365 +0000 UTC m=+318.183716318" observedRunningTime="2026-03-16 00:12:06.323016141 +0000 UTC m=+319.419316114" watchObservedRunningTime="2026-03-16 00:12:06.324621535 +0000 UTC m=+319.420921498" Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.298026 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkxpc" event={"ID":"ff69863d-13e1-444c-ba61-6d68a509a203","Type":"ContainerStarted","Data":"cf2475ba248dac35fe8355d380068957ad2cdb3c9fd96c04caf12282b0bbdb1f"} Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.301531 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qs6" 
event={"ID":"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d","Type":"ContainerStarted","Data":"f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2"} Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.304837 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4gwcw" event={"ID":"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c","Type":"ContainerStarted","Data":"1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc"} Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.328622 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bkxpc" podStartSLOduration=8.211844836 podStartE2EDuration="1m25.328588499s" podCreationTimestamp="2026-03-16 00:10:42 +0000 UTC" firstStartedPulling="2026-03-16 00:10:49.242420622 +0000 UTC m=+242.338720585" lastFinishedPulling="2026-03-16 00:12:06.359164305 +0000 UTC m=+319.455464248" observedRunningTime="2026-03-16 00:12:07.327751616 +0000 UTC m=+320.424051589" watchObservedRunningTime="2026-03-16 00:12:07.328588499 +0000 UTC m=+320.424888452" Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.349154 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4gwcw" podStartSLOduration=3.546257089 podStartE2EDuration="1m25.349128634s" podCreationTimestamp="2026-03-16 00:10:42 +0000 UTC" firstStartedPulling="2026-03-16 00:10:44.443817187 +0000 UTC m=+237.540117140" lastFinishedPulling="2026-03-16 00:12:06.246688742 +0000 UTC m=+319.342988685" observedRunningTime="2026-03-16 00:12:07.348172958 +0000 UTC m=+320.444472931" watchObservedRunningTime="2026-03-16 00:12:07.349128634 +0000 UTC m=+320.445428587" Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.373881 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-52qs6" podStartSLOduration=12.319315376 
podStartE2EDuration="1m22.373856555s" podCreationTimestamp="2026-03-16 00:10:45 +0000 UTC" firstStartedPulling="2026-03-16 00:10:56.720845664 +0000 UTC m=+249.817145627" lastFinishedPulling="2026-03-16 00:12:06.775386853 +0000 UTC m=+319.871686806" observedRunningTime="2026-03-16 00:12:07.37113234 +0000 UTC m=+320.467432293" watchObservedRunningTime="2026-03-16 00:12:07.373856555 +0000 UTC m=+320.470156508" Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.657132 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560332-wb8kg" Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.809153 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55wrt\" (UniqueName: \"kubernetes.io/projected/e570fb38-3e4c-4b9b-82d9-878ec6a5306f-kube-api-access-55wrt\") pod \"e570fb38-3e4c-4b9b-82d9-878ec6a5306f\" (UID: \"e570fb38-3e4c-4b9b-82d9-878ec6a5306f\") " Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.813840 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e570fb38-3e4c-4b9b-82d9-878ec6a5306f-kube-api-access-55wrt" (OuterVolumeSpecName: "kube-api-access-55wrt") pod "e570fb38-3e4c-4b9b-82d9-878ec6a5306f" (UID: "e570fb38-3e4c-4b9b-82d9-878ec6a5306f"). InnerVolumeSpecName "kube-api-access-55wrt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.911974 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55wrt\" (UniqueName: \"kubernetes.io/projected/e570fb38-3e4c-4b9b-82d9-878ec6a5306f-kube-api-access-55wrt\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:08 crc kubenswrapper[4816]: I0316 00:12:08.310266 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560332-wb8kg" event={"ID":"e570fb38-3e4c-4b9b-82d9-878ec6a5306f","Type":"ContainerDied","Data":"8490369d24788dc7f985ef3f4779fdeb1e0c4853fea51ce2d69595047a963516"} Mar 16 00:12:08 crc kubenswrapper[4816]: I0316 00:12:08.310317 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8490369d24788dc7f985ef3f4779fdeb1e0c4853fea51ce2d69595047a963516" Mar 16 00:12:08 crc kubenswrapper[4816]: I0316 00:12:08.310366 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560332-wb8kg" Mar 16 00:12:08 crc kubenswrapper[4816]: I0316 00:12:08.330955 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pb49" event={"ID":"a5ba22dd-8e8e-4beb-a540-e5c9687810b8","Type":"ContainerStarted","Data":"625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf"} Mar 16 00:12:08 crc kubenswrapper[4816]: I0316 00:12:08.333267 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wh2h7" event={"ID":"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd","Type":"ContainerStarted","Data":"0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d"} Mar 16 00:12:08 crc kubenswrapper[4816]: I0316 00:12:08.351773 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7pb49" podStartSLOduration=11.679330523 podStartE2EDuration="1m24.351757713s" 
podCreationTimestamp="2026-03-16 00:10:44 +0000 UTC" firstStartedPulling="2026-03-16 00:10:54.620394536 +0000 UTC m=+247.716694489" lastFinishedPulling="2026-03-16 00:12:07.292821726 +0000 UTC m=+320.389121679" observedRunningTime="2026-03-16 00:12:08.347423603 +0000 UTC m=+321.443723556" watchObservedRunningTime="2026-03-16 00:12:08.351757713 +0000 UTC m=+321.448057666" Mar 16 00:12:08 crc kubenswrapper[4816]: I0316 00:12:08.364036 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wh2h7" podStartSLOduration=8.055107686 podStartE2EDuration="1m26.36402418s" podCreationTimestamp="2026-03-16 00:10:42 +0000 UTC" firstStartedPulling="2026-03-16 00:10:49.24233088 +0000 UTC m=+242.338630843" lastFinishedPulling="2026-03-16 00:12:07.551247384 +0000 UTC m=+320.647547337" observedRunningTime="2026-03-16 00:12:08.361658145 +0000 UTC m=+321.457958098" watchObservedRunningTime="2026-03-16 00:12:08.36402418 +0000 UTC m=+321.460324123" Mar 16 00:12:11 crc kubenswrapper[4816]: I0316 00:12:11.361930 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6gbkl" event={"ID":"bad7b5f7-88a8-4c20-a010-734a46f59e05","Type":"ContainerStarted","Data":"9cbc70d2e0b275d40fbacb6be14712c60796f46bdd73e4f108a004a37c120cb9"} Mar 16 00:12:12 crc kubenswrapper[4816]: I0316 00:12:12.397584 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6gbkl" podStartSLOduration=15.860012539 podStartE2EDuration="1m28.397531805s" podCreationTimestamp="2026-03-16 00:10:44 +0000 UTC" firstStartedPulling="2026-03-16 00:10:56.721284176 +0000 UTC m=+249.817584129" lastFinishedPulling="2026-03-16 00:12:09.258803442 +0000 UTC m=+322.355103395" observedRunningTime="2026-03-16 00:12:12.392344142 +0000 UTC m=+325.488644095" watchObservedRunningTime="2026-03-16 00:12:12.397531805 +0000 UTC m=+325.493831778" Mar 16 00:12:12 crc 
kubenswrapper[4816]: I0316 00:12:12.666374 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:12:12 crc kubenswrapper[4816]: I0316 00:12:12.666698 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.017450 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.017503 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.087437 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.088003 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.272459 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.272714 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.378142 4816 generic.go:334] "Generic (PLEG): container finished" podID="9fc59286-0388-4519-afc7-f2c8cf80ab40" containerID="a30805e487fac9e751dab1510445d1b512d8b7784f8e73df1f67f72887178e24" exitCode=0 Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.378219 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29560320-s9q72" 
event={"ID":"9fc59286-0388-4519-afc7-f2c8cf80ab40","Type":"ContainerDied","Data":"a30805e487fac9e751dab1510445d1b512d8b7784f8e73df1f67f72887178e24"} Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.380930 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvpqn" event={"ID":"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d","Type":"ContainerStarted","Data":"8f95ead769819114b5324ad74b013a299738b066a23d9b7aab0526d5b3f15f3a"} Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.426893 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hvpqn" podStartSLOduration=11.489613704 podStartE2EDuration="1m27.426870591s" podCreationTimestamp="2026-03-16 00:10:46 +0000 UTC" firstStartedPulling="2026-03-16 00:10:56.668778093 +0000 UTC m=+249.765078046" lastFinishedPulling="2026-03-16 00:12:12.60603498 +0000 UTC m=+325.702334933" observedRunningTime="2026-03-16 00:12:13.423294753 +0000 UTC m=+326.519594706" watchObservedRunningTime="2026-03-16 00:12:13.426870591 +0000 UTC m=+326.523170544" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.429797 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.430608 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.430966 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.434330 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.482835 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.492786 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.493716 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.431923 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.655236 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.655291 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.694474 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29560320-s9q72" Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.701426 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.816087 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9fc59286-0388-4519-afc7-f2c8cf80ab40-serviceca\") pod \"9fc59286-0388-4519-afc7-f2c8cf80ab40\" (UID: \"9fc59286-0388-4519-afc7-f2c8cf80ab40\") " Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.816217 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tj47\" (UniqueName: \"kubernetes.io/projected/9fc59286-0388-4519-afc7-f2c8cf80ab40-kube-api-access-2tj47\") pod \"9fc59286-0388-4519-afc7-f2c8cf80ab40\" (UID: \"9fc59286-0388-4519-afc7-f2c8cf80ab40\") " Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.817928 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fc59286-0388-4519-afc7-f2c8cf80ab40-serviceca" (OuterVolumeSpecName: "serviceca") pod "9fc59286-0388-4519-afc7-f2c8cf80ab40" (UID: "9fc59286-0388-4519-afc7-f2c8cf80ab40"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.822387 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fc59286-0388-4519-afc7-f2c8cf80ab40-kube-api-access-2tj47" (OuterVolumeSpecName: "kube-api-access-2tj47") pod "9fc59286-0388-4519-afc7-f2c8cf80ab40" (UID: "9fc59286-0388-4519-afc7-f2c8cf80ab40"). InnerVolumeSpecName "kube-api-access-2tj47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.917656 4816 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9fc59286-0388-4519-afc7-f2c8cf80ab40-serviceca\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.917691 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tj47\" (UniqueName: \"kubernetes.io/projected/9fc59286-0388-4519-afc7-f2c8cf80ab40-kube-api-access-2tj47\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:15 crc kubenswrapper[4816]: I0316 00:12:15.098603 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:12:15 crc kubenswrapper[4816]: I0316 00:12:15.099541 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:12:15 crc kubenswrapper[4816]: I0316 00:12:15.148130 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:12:15 crc kubenswrapper[4816]: I0316 00:12:15.396098 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29560320-s9q72" Mar 16 00:12:15 crc kubenswrapper[4816]: I0316 00:12:15.396305 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29560320-s9q72" event={"ID":"9fc59286-0388-4519-afc7-f2c8cf80ab40","Type":"ContainerDied","Data":"469ef439f1bc4e49165115c6fecd0f6feec675c1f680294bca4301ee3520daee"} Mar 16 00:12:15 crc kubenswrapper[4816]: I0316 00:12:15.397737 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="469ef439f1bc4e49165115c6fecd0f6feec675c1f680294bca4301ee3520daee" Mar 16 00:12:15 crc kubenswrapper[4816]: I0316 00:12:15.452059 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:12:15 crc kubenswrapper[4816]: I0316 00:12:15.455358 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:12:16 crc kubenswrapper[4816]: I0316 00:12:16.084594 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:12:16 crc kubenswrapper[4816]: I0316 00:12:16.085722 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:12:16 crc kubenswrapper[4816]: I0316 00:12:16.134806 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:12:16 crc kubenswrapper[4816]: I0316 00:12:16.461923 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:12:16 crc kubenswrapper[4816]: I0316 00:12:16.472818 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:12:16 crc kubenswrapper[4816]: I0316 
00:12:16.473191 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:12:17 crc kubenswrapper[4816]: I0316 00:12:17.473454 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bkxpc"] Mar 16 00:12:17 crc kubenswrapper[4816]: I0316 00:12:17.474222 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bkxpc" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" containerName="registry-server" containerID="cri-o://cf2475ba248dac35fe8355d380068957ad2cdb3c9fd96c04caf12282b0bbdb1f" gracePeriod=2 Mar 16 00:12:17 crc kubenswrapper[4816]: I0316 00:12:17.517704 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hvpqn" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerName="registry-server" probeResult="failure" output=< Mar 16 00:12:17 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 16 00:12:17 crc kubenswrapper[4816]: > Mar 16 00:12:17 crc kubenswrapper[4816]: I0316 00:12:17.676058 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8q2xw"] Mar 16 00:12:17 crc kubenswrapper[4816]: I0316 00:12:17.676379 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8q2xw" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerName="registry-server" containerID="cri-o://050311929b94abcb5fed29f67a5c5b0ea4b9aaa7f08d32d15808bc1dd56bc7c0" gracePeriod=2 Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.439381 4816 generic.go:334] "Generic (PLEG): container finished" podID="ff69863d-13e1-444c-ba61-6d68a509a203" containerID="cf2475ba248dac35fe8355d380068957ad2cdb3c9fd96c04caf12282b0bbdb1f" exitCode=0 Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.439599 4816 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-bkxpc" event={"ID":"ff69863d-13e1-444c-ba61-6d68a509a203","Type":"ContainerDied","Data":"cf2475ba248dac35fe8355d380068957ad2cdb3c9fd96c04caf12282b0bbdb1f"} Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.719833 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.720523 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.720633 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.720727 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:12:18 crc 
kubenswrapper[4816]: I0316 00:12:18.722339 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.722635 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.722872 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.732793 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.738144 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.744860 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.745642 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.899160 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.984996 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.994616 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.002343 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:12:19 crc kubenswrapper[4816]: W0316 00:12:19.447562 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-c40738c61b48d6c4c4636768df0e4d99df2211903494181e2eaab3802f9cdd34 WatchSource:0}: Error finding container c40738c61b48d6c4c4636768df0e4d99df2211903494181e2eaab3802f9cdd34: Status 404 returned error can't find the container with id c40738c61b48d6c4c4636768df0e4d99df2211903494181e2eaab3802f9cdd34 Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.454935 4816 generic.go:334] "Generic (PLEG): container finished" podID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerID="050311929b94abcb5fed29f67a5c5b0ea4b9aaa7f08d32d15808bc1dd56bc7c0" exitCode=0 Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.454981 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8q2xw" event={"ID":"bf586c6e-f957-46fc-8140-f9a9ea22510f","Type":"ContainerDied","Data":"050311929b94abcb5fed29f67a5c5b0ea4b9aaa7f08d32d15808bc1dd56bc7c0"} Mar 16 00:12:19 crc kubenswrapper[4816]: W0316 00:12:19.474319 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-4ebbcb4fc0112955f2090e810b55796a7e29fd750b6da3ad533b188be904fb4c WatchSource:0}: Error finding container 4ebbcb4fc0112955f2090e810b55796a7e29fd750b6da3ad533b188be904fb4c: Status 404 returned error can't find the container with id 4ebbcb4fc0112955f2090e810b55796a7e29fd750b6da3ad533b188be904fb4c Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.580034 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:12:19 crc kubenswrapper[4816]: W0316 00:12:19.682736 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-d6e80ac2881be293aae0d1cf60a7526311b9ca54768daed4b468ce6957f38c33 WatchSource:0}: Error finding container d6e80ac2881be293aae0d1cf60a7526311b9ca54768daed4b468ce6957f38c33: Status 404 returned error can't find the container with id d6e80ac2881be293aae0d1cf60a7526311b9ca54768daed4b468ce6957f38c33 Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.734166 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-catalog-content\") pod \"ff69863d-13e1-444c-ba61-6d68a509a203\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.734254 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-utilities\") pod \"ff69863d-13e1-444c-ba61-6d68a509a203\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.734304 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8frh\" (UniqueName: \"kubernetes.io/projected/ff69863d-13e1-444c-ba61-6d68a509a203-kube-api-access-p8frh\") pod \"ff69863d-13e1-444c-ba61-6d68a509a203\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.735813 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-utilities" (OuterVolumeSpecName: "utilities") pod "ff69863d-13e1-444c-ba61-6d68a509a203" (UID: 
"ff69863d-13e1-444c-ba61-6d68a509a203"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.739810 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff69863d-13e1-444c-ba61-6d68a509a203-kube-api-access-p8frh" (OuterVolumeSpecName: "kube-api-access-p8frh") pod "ff69863d-13e1-444c-ba61-6d68a509a203" (UID: "ff69863d-13e1-444c-ba61-6d68a509a203"). InnerVolumeSpecName "kube-api-access-p8frh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.836740 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.839063 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8frh\" (UniqueName: \"kubernetes.io/projected/ff69863d-13e1-444c-ba61-6d68a509a203-kube-api-access-p8frh\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.874781 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6gbkl"] Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.875058 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6gbkl" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerName="registry-server" containerID="cri-o://9cbc70d2e0b275d40fbacb6be14712c60796f46bdd73e4f108a004a37c120cb9" gracePeriod=2 Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.214433 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.346797 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-utilities\") pod \"bf586c6e-f957-46fc-8140-f9a9ea22510f\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.347491 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gwq2\" (UniqueName: \"kubernetes.io/projected/bf586c6e-f957-46fc-8140-f9a9ea22510f-kube-api-access-8gwq2\") pod \"bf586c6e-f957-46fc-8140-f9a9ea22510f\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.347593 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-catalog-content\") pod \"bf586c6e-f957-46fc-8140-f9a9ea22510f\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.347827 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-utilities" (OuterVolumeSpecName: "utilities") pod "bf586c6e-f957-46fc-8140-f9a9ea22510f" (UID: "bf586c6e-f957-46fc-8140-f9a9ea22510f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.348122 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.353149 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf586c6e-f957-46fc-8140-f9a9ea22510f-kube-api-access-8gwq2" (OuterVolumeSpecName: "kube-api-access-8gwq2") pod "bf586c6e-f957-46fc-8140-f9a9ea22510f" (UID: "bf586c6e-f957-46fc-8140-f9a9ea22510f"). InnerVolumeSpecName "kube-api-access-8gwq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.448859 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gwq2\" (UniqueName: \"kubernetes.io/projected/bf586c6e-f957-46fc-8140-f9a9ea22510f-kube-api-access-8gwq2\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.462648 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8q2xw" event={"ID":"bf586c6e-f957-46fc-8140-f9a9ea22510f","Type":"ContainerDied","Data":"e4ae76a3c7fcca7fb114ac9afc90c35f9554a0225d9ad44974098c92d5909906"} Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.462728 4816 scope.go:117] "RemoveContainer" containerID="050311929b94abcb5fed29f67a5c5b0ea4b9aaa7f08d32d15808bc1dd56bc7c0" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.462862 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.465208 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c40738c61b48d6c4c4636768df0e4d99df2211903494181e2eaab3802f9cdd34"} Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.467701 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d6e80ac2881be293aae0d1cf60a7526311b9ca54768daed4b468ce6957f38c33"} Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.470729 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkxpc" event={"ID":"ff69863d-13e1-444c-ba61-6d68a509a203","Type":"ContainerDied","Data":"bb49e18eedefb469d6109a46587df4a9ef4eb1e0a35954df9209a551fbb7b5b4"} Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.470848 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.479176 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4ebbcb4fc0112955f2090e810b55796a7e29fd750b6da3ad533b188be904fb4c"} Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.482055 4816 scope.go:117] "RemoveContainer" containerID="ac8cc1ccd360062b03be48af7d20fc5e22baa578d5f6d15342b7e9dcce308a09" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.510881 4816 scope.go:117] "RemoveContainer" containerID="307037ba13f42f68192fcc6d4406e472de7d9aac5f7546be49cd42537db26240" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.540785 4816 scope.go:117] "RemoveContainer" containerID="cf2475ba248dac35fe8355d380068957ad2cdb3c9fd96c04caf12282b0bbdb1f" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.552786 4816 scope.go:117] "RemoveContainer" containerID="74ea61ccf0b15157a029e2724ad851af86bb050fe43292729cc5d2b513ea7141" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.564417 4816 scope.go:117] "RemoveContainer" containerID="2d7e1ead92ce8010c6084321e28f13a3b17186a0141a9a086a18947183a41d47" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.765815 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf586c6e-f957-46fc-8140-f9a9ea22510f" (UID: "bf586c6e-f957-46fc-8140-f9a9ea22510f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.856335 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.091999 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8q2xw"] Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.104081 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8q2xw"] Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.318942 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff69863d-13e1-444c-ba61-6d68a509a203" (UID: "ff69863d-13e1-444c-ba61-6d68a509a203"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.361863 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.413055 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bkxpc"] Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.413118 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bkxpc"] Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.485440 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3ea1184f822884ab69fdfb7a5fe5136882ffc21bf737db72c6911e5ee012e8d7"} Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.487628 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"dc39a4751822c7014b25d13324976e8315a93bbe519111882b17f3808d80cdcc"} Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.499235 4816 generic.go:334] "Generic (PLEG): container finished" podID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerID="9cbc70d2e0b275d40fbacb6be14712c60796f46bdd73e4f108a004a37c120cb9" exitCode=0 Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.499309 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6gbkl" event={"ID":"bad7b5f7-88a8-4c20-a010-734a46f59e05","Type":"ContainerDied","Data":"9cbc70d2e0b275d40fbacb6be14712c60796f46bdd73e4f108a004a37c120cb9"} Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.499337 
4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6gbkl" event={"ID":"bad7b5f7-88a8-4c20-a010-734a46f59e05","Type":"ContainerDied","Data":"dfccefcb0f8e6864404f0a8715036becb9b7ec4a3aef59dca2da5e935bde36d5"} Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.499350 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfccefcb0f8e6864404f0a8715036becb9b7ec4a3aef59dca2da5e935bde36d5" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.500816 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"554f8200c740d02a2297f006a6411dc8ec1040cc94013d3eb914e9f1af3bbcc7"} Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.501419 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.542837 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.664294 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2d9d\" (UniqueName: \"kubernetes.io/projected/bad7b5f7-88a8-4c20-a010-734a46f59e05-kube-api-access-l2d9d\") pod \"bad7b5f7-88a8-4c20-a010-734a46f59e05\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.664433 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-catalog-content\") pod \"bad7b5f7-88a8-4c20-a010-734a46f59e05\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.664490 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-utilities\") pod \"bad7b5f7-88a8-4c20-a010-734a46f59e05\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.674335 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-utilities" (OuterVolumeSpecName: "utilities") pod "bad7b5f7-88a8-4c20-a010-734a46f59e05" (UID: "bad7b5f7-88a8-4c20-a010-734a46f59e05"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.674666 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad7b5f7-88a8-4c20-a010-734a46f59e05-kube-api-access-l2d9d" (OuterVolumeSpecName: "kube-api-access-l2d9d") pod "bad7b5f7-88a8-4c20-a010-734a46f59e05" (UID: "bad7b5f7-88a8-4c20-a010-734a46f59e05"). InnerVolumeSpecName "kube-api-access-l2d9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.686204 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" path="/var/lib/kubelet/pods/bf586c6e-f957-46fc-8140-f9a9ea22510f/volumes" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.687113 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" path="/var/lib/kubelet/pods/ff69863d-13e1-444c-ba61-6d68a509a203/volumes" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.712617 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bad7b5f7-88a8-4c20-a010-734a46f59e05" (UID: "bad7b5f7-88a8-4c20-a010-734a46f59e05"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.766263 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2d9d\" (UniqueName: \"kubernetes.io/projected/bad7b5f7-88a8-4c20-a010-734a46f59e05-kube-api-access-l2d9d\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.766320 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.766335 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.507849 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.544474 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6gbkl"] Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.552314 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6gbkl"] Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.552714 4816 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553050 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e570fb38-3e4c-4b9b-82d9-878ec6a5306f" containerName="oc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553083 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e570fb38-3e4c-4b9b-82d9-878ec6a5306f" containerName="oc" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553104 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc59286-0388-4519-afc7-f2c8cf80ab40" containerName="image-pruner" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553116 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc59286-0388-4519-afc7-f2c8cf80ab40" containerName="image-pruner" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553141 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" containerName="extract-utilities" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553156 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" containerName="extract-utilities" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553173 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerName="extract-content" Mar 16 00:12:22 crc 
kubenswrapper[4816]: I0316 00:12:22.553185 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerName="extract-content" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553272 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerName="registry-server" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553286 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerName="registry-server" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553302 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" containerName="registry-server" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553313 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" containerName="registry-server" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553331 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerName="extract-content" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553341 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerName="extract-content" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553359 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerName="extract-utilities" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553372 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerName="extract-utilities" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553386 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerName="extract-utilities" Mar 16 00:12:22 crc 
kubenswrapper[4816]: I0316 00:12:22.553397 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerName="extract-utilities" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553411 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerName="registry-server" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553423 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerName="registry-server" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553439 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f10e41c-e6db-4083-bb86-ed0d39cc1a5a" containerName="pruner" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553450 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f10e41c-e6db-4083-bb86-ed0d39cc1a5a" containerName="pruner" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553464 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" containerName="extract-content" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553475 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" containerName="extract-content" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553743 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerName="registry-server" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553758 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f10e41c-e6db-4083-bb86-ed0d39cc1a5a" containerName="pruner" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553769 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e570fb38-3e4c-4b9b-82d9-878ec6a5306f" containerName="oc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553776 
4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" containerName="registry-server" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553791 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fc59286-0388-4519-afc7-f2c8cf80ab40" containerName="image-pruner" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553798 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerName="registry-server" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.554201 4816 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.554309 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.554507 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd" gracePeriod=15 Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.554586 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff" gracePeriod=15 Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.554670 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" 
containerID="cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80" gracePeriod=15 Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.554734 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5" gracePeriod=15 Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.554936 4816 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555188 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555203 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555256 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555267 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555277 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555284 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555296 4816 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555303 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555312 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555319 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555329 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555335 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555342 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555350 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555369 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555376 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 
00:12:22.555481 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555493 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555501 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555516 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555527 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555536 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555562 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555671 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555681 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555692 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555699 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555796 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555810 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.554761 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a" gracePeriod=15 Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.598761 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.678293 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.678346 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.678415 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.678438 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.678466 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.678557 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.678591 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.678620 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.779533 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.779946 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780089 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780151 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780186 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780218 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780258 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780294 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780387 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780435 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780464 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780491 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780517 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780567 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780602 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780631 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.888297 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: W0316 00:12:22.908685 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-64dc539850c7ea7966762da40da0e3a6db88dd12101785edd9669d7fb3f9a68a WatchSource:0}: Error finding container 64dc539850c7ea7966762da40da0e3a6db88dd12101785edd9669d7fb3f9a68a: Status 404 returned error can't find the container with id 64dc539850c7ea7966762da40da0e3a6db88dd12101785edd9669d7fb3f9a68a Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.912538 4816 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189d29f0e7c5f931 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:12:22.911244593 +0000 UTC m=+336.007544546,LastTimestamp:2026-03-16 00:12:22.911244593 +0000 UTC m=+336.007544546,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.513990 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.515370 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.516057 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5" exitCode=0 Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.516082 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff" exitCode=0 Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.516091 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80" exitCode=0 Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.516098 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a" exitCode=2 Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.516156 4816 scope.go:117] "RemoveContainer" containerID="29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f" Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.517862 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4"} Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.517975 4816 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"64dc539850c7ea7966762da40da0e3a6db88dd12101785edd9669d7fb3f9a68a"} Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.519181 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.519640 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.520893 4816 generic.go:334] "Generic (PLEG): container finished" podID="a9416716-e666-46d6-9d77-fe5c9702c035" containerID="dd1bd79082698289c067776233601bb17f6ba8cbb98dcb745b3329e5f4f6fb1f" exitCode=0 Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.521033 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a9416716-e666-46d6-9d77-fe5c9702c035","Type":"ContainerDied","Data":"dd1bd79082698289c067776233601bb17f6ba8cbb98dcb745b3329e5f4f6fb1f"} Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.521595 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 
00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.521915 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.522136 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.677730 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" path="/var/lib/kubelet/pods/bad7b5f7-88a8-4c20-a010-734a46f59e05/volumes" Mar 16 00:12:23 crc kubenswrapper[4816]: E0316 00:12:23.752713 4816 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:23 crc kubenswrapper[4816]: E0316 00:12:23.753205 4816 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:23 crc kubenswrapper[4816]: E0316 00:12:23.753448 4816 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:23 crc kubenswrapper[4816]: E0316 00:12:23.753690 4816 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:23 crc kubenswrapper[4816]: E0316 00:12:23.754036 4816 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.754136 4816 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 16 00:12:23 crc kubenswrapper[4816]: E0316 00:12:23.754506 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="200ms" Mar 16 00:12:23 crc kubenswrapper[4816]: E0316 00:12:23.954929 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="400ms" Mar 16 00:12:24 crc kubenswrapper[4816]: E0316 00:12:24.356089 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="800ms" Mar 16 00:12:24 crc kubenswrapper[4816]: I0316 00:12:24.529484 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 16 00:12:24 crc 
kubenswrapper[4816]: I0316 00:12:24.529573 4816 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="3ea1184f822884ab69fdfb7a5fe5136882ffc21bf737db72c6911e5ee012e8d7" exitCode=255 Mar 16 00:12:24 crc kubenswrapper[4816]: I0316 00:12:24.529676 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"3ea1184f822884ab69fdfb7a5fe5136882ffc21bf737db72c6911e5ee012e8d7"} Mar 16 00:12:24 crc kubenswrapper[4816]: I0316 00:12:24.530258 4816 scope.go:117] "RemoveContainer" containerID="3ea1184f822884ab69fdfb7a5fe5136882ffc21bf737db72c6911e5ee012e8d7" Mar 16 00:12:24 crc kubenswrapper[4816]: I0316 00:12:24.530301 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:24 crc kubenswrapper[4816]: I0316 00:12:24.530625 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:24 crc kubenswrapper[4816]: I0316 00:12:24.531371 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" 
Mar 16 00:12:24 crc kubenswrapper[4816]: I0316 00:12:24.533604 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.078416 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.079622 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.080271 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.080605 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.080828 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.081035 4816 
status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.082590 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.083228 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.083538 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.083892 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.084202 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: E0316 00:12:25.157359 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="1.6s" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.168940 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169007 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169038 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-var-lock\") pod \"a9416716-e666-46d6-9d77-fe5c9702c035\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169095 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-kubelet-dir\") pod \"a9416716-e666-46d6-9d77-fe5c9702c035\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169158 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9416716-e666-46d6-9d77-fe5c9702c035-kube-api-access\") pod \"a9416716-e666-46d6-9d77-fe5c9702c035\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169089 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169084 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169117 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-var-lock" (OuterVolumeSpecName: "var-lock") pod "a9416716-e666-46d6-9d77-fe5c9702c035" (UID: "a9416716-e666-46d6-9d77-fe5c9702c035"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169125 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a9416716-e666-46d6-9d77-fe5c9702c035" (UID: "a9416716-e666-46d6-9d77-fe5c9702c035"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169264 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169349 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169579 4816 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169601 4816 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169629 4816 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-var-lock\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169640 4816 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169648 4816 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.178655 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9416716-e666-46d6-9d77-fe5c9702c035-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a9416716-e666-46d6-9d77-fe5c9702c035" (UID: "a9416716-e666-46d6-9d77-fe5c9702c035"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.271108 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9416716-e666-46d6-9d77-fe5c9702c035-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.542934 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a9416716-e666-46d6-9d77-fe5c9702c035","Type":"ContainerDied","Data":"dd148857a3f8f3853eb8381f642acb80c9aad6dc4ab5491e0ecfe89f172f60d6"} Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.542974 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.543003 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd148857a3f8f3853eb8381f642acb80c9aad6dc4ab5491e0ecfe89f172f60d6" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.546391 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.547288 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd" exitCode=0 Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.547410 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.547410 4816 scope.go:117] "RemoveContainer" containerID="4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.549946 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.551335 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.551391 4816 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="486dacbeaae2de35a560c3abe670d11c1b0aff52bf6fcfee4e790d8493ac9b11" exitCode=255 Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.551436 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"486dacbeaae2de35a560c3abe670d11c1b0aff52bf6fcfee4e790d8493ac9b11"} Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.552491 4816 scope.go:117] "RemoveContainer" containerID="486dacbeaae2de35a560c3abe670d11c1b0aff52bf6fcfee4e790d8493ac9b11" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.552647 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: E0316 00:12:25.553036 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.553103 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.553391 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.553652 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.596801 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.597224 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.597807 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.599014 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.599753 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.600294 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.600702 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.601311 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.606885 4816 scope.go:117] "RemoveContainer" 
containerID="c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.621779 4816 scope.go:117] "RemoveContainer" containerID="da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.650635 4816 scope.go:117] "RemoveContainer" containerID="c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.665812 4816 scope.go:117] "RemoveContainer" containerID="46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.675352 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.688533 4816 scope.go:117] "RemoveContainer" containerID="0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.711458 4816 scope.go:117] "RemoveContainer" containerID="4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5" Mar 16 00:12:25 crc kubenswrapper[4816]: E0316 00:12:25.712235 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\": container with ID starting with 4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5 not found: ID does not exist" containerID="4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.712285 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5"} err="failed to get container status 
\"4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\": rpc error: code = NotFound desc = could not find container \"4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\": container with ID starting with 4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5 not found: ID does not exist" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.712315 4816 scope.go:117] "RemoveContainer" containerID="c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff" Mar 16 00:12:25 crc kubenswrapper[4816]: E0316 00:12:25.712967 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\": container with ID starting with c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff not found: ID does not exist" containerID="c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.713006 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff"} err="failed to get container status \"c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\": rpc error: code = NotFound desc = could not find container \"c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\": container with ID starting with c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff not found: ID does not exist" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.713036 4816 scope.go:117] "RemoveContainer" containerID="da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80" Mar 16 00:12:25 crc kubenswrapper[4816]: E0316 00:12:25.713312 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\": container with ID starting with da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80 not found: ID does not exist" containerID="da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.713331 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80"} err="failed to get container status \"da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\": rpc error: code = NotFound desc = could not find container \"da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\": container with ID starting with da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80 not found: ID does not exist" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.713343 4816 scope.go:117] "RemoveContainer" containerID="c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a" Mar 16 00:12:25 crc kubenswrapper[4816]: E0316 00:12:25.714100 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\": container with ID starting with c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a not found: ID does not exist" containerID="c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.714121 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a"} err="failed to get container status \"c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\": rpc error: code = NotFound desc = could not find container \"c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\": container with ID 
starting with c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a not found: ID does not exist" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.714134 4816 scope.go:117] "RemoveContainer" containerID="46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd" Mar 16 00:12:25 crc kubenswrapper[4816]: E0316 00:12:25.714336 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\": container with ID starting with 46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd not found: ID does not exist" containerID="46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.714354 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd"} err="failed to get container status \"46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\": rpc error: code = NotFound desc = could not find container \"46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\": container with ID starting with 46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd not found: ID does not exist" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.714372 4816 scope.go:117] "RemoveContainer" containerID="0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1" Mar 16 00:12:25 crc kubenswrapper[4816]: E0316 00:12:25.714587 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\": container with ID starting with 0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1 not found: ID does not exist" containerID="0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1" Mar 16 
00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.714608 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1"} err="failed to get container status \"0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\": rpc error: code = NotFound desc = could not find container \"0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\": container with ID starting with 0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1 not found: ID does not exist" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.714621 4816 scope.go:117] "RemoveContainer" containerID="3ea1184f822884ab69fdfb7a5fe5136882ffc21bf737db72c6911e5ee012e8d7" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.511755 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.512502 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.512895 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.513076 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" 
pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.513389 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.550318 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.550909 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.551314 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.551670 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 
38.102.83.158:6443: connect: connection refused" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.552074 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.556540 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 16 00:12:26 crc kubenswrapper[4816]: E0316 00:12:26.569574 4816 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189d29f0e7c5f931 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:12:22.911244593 +0000 UTC m=+336.007544546,LastTimestamp:2026-03-16 00:12:22.911244593 +0000 UTC m=+336.007544546,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:12:26 crc kubenswrapper[4816]: E0316 00:12:26.758864 4816 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="3.2s" Mar 16 00:12:27 crc kubenswrapper[4816]: I0316 00:12:27.672112 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:27 crc kubenswrapper[4816]: I0316 00:12:27.672825 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:27 crc kubenswrapper[4816]: I0316 00:12:27.673069 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:27 crc kubenswrapper[4816]: I0316 00:12:27.673433 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:27 crc kubenswrapper[4816]: E0316 00:12:27.715525 4816 desired_state_of_world_populator.go:312] "Error processing volume" 
err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" volumeName="registry-storage" Mar 16 00:12:29 crc kubenswrapper[4816]: E0316 00:12:29.960233 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="6.4s" Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.612083 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.613028 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.613075 4816 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac" exitCode=1 Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.613120 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac"} Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.613720 4816 scope.go:117] "RemoveContainer" 
containerID="5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac" Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.614885 4816 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.615383 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.616529 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.617334 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.617771 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:36 crc kubenswrapper[4816]: E0316 00:12:36.361251 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="7s"
Mar 16 00:12:36 crc kubenswrapper[4816]: E0316 00:12:36.571020 4816 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189d29f0e7c5f931 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:12:22.911244593 +0000 UTC m=+336.007544546,LastTimestamp:2026-03-16 00:12:22.911244593 +0000 UTC m=+336.007544546,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.624123 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.624673 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.624720 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4aa569a30f818c0b18a06057a8cfa80679949921cd47150e4a19e7cda2ca413d"}
Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.626203 4816 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.627241 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.627689 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.628167 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.628415 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.666958 4816 scope.go:117] "RemoveContainer" containerID="486dacbeaae2de35a560c3abe670d11c1b0aff52bf6fcfee4e790d8493ac9b11"
Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.667426 4816 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.667886 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.668323 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.668587 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.668847 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.634509 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.635662 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.635715 4816 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="bf90d2b66ba96bda100b2d7e60bddd86b4409c573e87857f65bb7785355097cf" exitCode=255
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.635748 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"bf90d2b66ba96bda100b2d7e60bddd86b4409c573e87857f65bb7785355097cf"}
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.635788 4816 scope.go:117] "RemoveContainer" containerID="486dacbeaae2de35a560c3abe670d11c1b0aff52bf6fcfee4e790d8493ac9b11"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.636267 4816 scope.go:117] "RemoveContainer" containerID="bf90d2b66ba96bda100b2d7e60bddd86b4409c573e87857f65bb7785355097cf"
Mar 16 00:12:37 crc kubenswrapper[4816]: E0316 00:12:37.636521 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.639425 4816 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.639871 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.640403 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.640758 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.640999 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.667762 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.675069 4816 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.677916 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.678395 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.679021 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.679428 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.679858 4816 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.680229 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.680676 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.681108 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.681470 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.717142 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.717172 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398"
Mar 16 00:12:37 crc kubenswrapper[4816]: E0316 00:12:37.717481 4816 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.717899 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:12:37 crc kubenswrapper[4816]: W0316 00:12:37.747591 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-6672936e2d6cfd90abf352ef7108e1954aed5c372ffd62b1a57c84a375772f32 WatchSource:0}: Error finding container 6672936e2d6cfd90abf352ef7108e1954aed5c372ffd62b1a57c84a375772f32: Status 404 returned error can't find the container with id 6672936e2d6cfd90abf352ef7108e1954aed5c372ffd62b1a57c84a375772f32
Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.645050 4816 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="a2123cc741523f2b46f75dea80db3f916696a6a9cdfe0d6979eab60ea890fc6e" exitCode=0
Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.645167 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"a2123cc741523f2b46f75dea80db3f916696a6a9cdfe0d6979eab60ea890fc6e"}
Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.645603 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6672936e2d6cfd90abf352ef7108e1954aed5c372ffd62b1a57c84a375772f32"}
Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.646161 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398"
Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.646198 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398"
Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.646775 4816 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:38 crc kubenswrapper[4816]: E0316 00:12:38.646917 4816 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.647233 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.647730 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.648167 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.648678 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused"
Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.649356 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log"
Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.727753 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:12:39 crc kubenswrapper[4816]: I0316 00:12:39.674226 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"40545204d92772841c657d6eeb43c327aa4555b2f6a8d8d3c8bc781fc6d46131"}
Mar 16 00:12:39 crc kubenswrapper[4816]: I0316 00:12:39.674494 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"85b4a9ece3a249bf5d9cd638d77f53ce85d4536ea7bc057b1d9c968457155e86"}
Mar 16 00:12:39 crc kubenswrapper[4816]: I0316 00:12:39.674506 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a328b262655f75aad03f269c24749d7fbbf17a17c55477a659d8bdd35a3a18de"}
Mar 16 00:12:39 crc kubenswrapper[4816]: I0316 00:12:39.674516 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fae10584beeafc12e6ee60bc51600e75d592acdaf0bfcb1d6909e9e887d33af8"}
Mar 16 00:12:40 crc kubenswrapper[4816]: I0316 00:12:40.680566 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"96fece39acca2cfb04562648fd9193a7dd8eadc42c788da332138e3a4ca4f8cc"}
Mar 16 00:12:40 crc kubenswrapper[4816]: I0316 00:12:40.680896 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:12:40 crc kubenswrapper[4816]: I0316 00:12:40.681009 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398"
Mar 16 00:12:40 crc kubenswrapper[4816]: I0316 00:12:40.681049 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398"
Mar 16 00:12:42 crc kubenswrapper[4816]: I0316 00:12:42.583070 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:12:42 crc kubenswrapper[4816]: I0316 00:12:42.583273 4816 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 16 00:12:42 crc kubenswrapper[4816]: I0316 00:12:42.583392 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 16 00:12:42 crc kubenswrapper[4816]: I0316 00:12:42.717995 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:12:42 crc kubenswrapper[4816]: I0316 00:12:42.718057 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:12:42 crc kubenswrapper[4816]: I0316 00:12:42.723504 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:12:45 crc kubenswrapper[4816]: I0316 00:12:45.698124 4816 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:12:46 crc kubenswrapper[4816]: I0316 00:12:46.717735 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398"
Mar 16 00:12:46 crc kubenswrapper[4816]: I0316 00:12:46.718137 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398"
Mar 16 00:12:46 crc kubenswrapper[4816]: I0316 00:12:46.726771 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:12:47 crc kubenswrapper[4816]: I0316 00:12:47.701380 4816 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="99986f4e-b436-455c-9692-380f895c4832"
Mar 16 00:12:47 crc kubenswrapper[4816]: I0316 00:12:47.722594 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398"
Mar 16 00:12:47 crc kubenswrapper[4816]: I0316 00:12:47.722625 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398"
Mar 16 00:12:47 crc kubenswrapper[4816]: I0316 00:12:47.725523 4816 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="99986f4e-b436-455c-9692-380f895c4832"
Mar 16 00:12:51 crc kubenswrapper[4816]: I0316 00:12:51.668338 4816 scope.go:117] "RemoveContainer" containerID="bf90d2b66ba96bda100b2d7e60bddd86b4409c573e87857f65bb7785355097cf"
Mar 16 00:12:51 crc kubenswrapper[4816]: E0316 00:12:51.669295 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:12:52 crc kubenswrapper[4816]: I0316 00:12:52.587677 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:12:52 crc kubenswrapper[4816]: I0316 00:12:52.592389 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:12:54 crc kubenswrapper[4816]: I0316 00:12:54.875399 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 16 00:12:55 crc kubenswrapper[4816]: I0316 00:12:55.144365 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 16 00:12:55 crc kubenswrapper[4816]: I0316 00:12:55.229500 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 16 00:12:55 crc kubenswrapper[4816]: I0316 00:12:55.698590 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 16 00:12:55 crc kubenswrapper[4816]: I0316 00:12:55.850469 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 16 00:12:56 crc kubenswrapper[4816]: I0316 00:12:56.244838 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 16 00:12:56 crc kubenswrapper[4816]: I0316 00:12:56.412797 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 16 00:12:56 crc kubenswrapper[4816]: I0316 00:12:56.650228 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 16 00:12:56 crc kubenswrapper[4816]: I0316 00:12:56.769098 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 16 00:12:57 crc kubenswrapper[4816]: I0316 00:12:57.041016 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 16 00:12:57 crc kubenswrapper[4816]: I0316 00:12:57.107815 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 16 00:12:57 crc kubenswrapper[4816]: I0316 00:12:57.112976 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 16 00:12:57 crc kubenswrapper[4816]: I0316 00:12:57.322264 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 16 00:12:57 crc kubenswrapper[4816]: I0316 00:12:57.620512 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 16 00:12:57 crc kubenswrapper[4816]: I0316 00:12:57.668717 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 16 00:12:57 crc kubenswrapper[4816]: I0316 00:12:57.800851 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.002282 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.049798 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.058402 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.209265 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.389360 4816 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.546452 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.627097 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.673690 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.680877 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.720381 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.753568 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.789520 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.910029 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.949402 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.998266 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.075093 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.152434 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.162427 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.166654 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.230756 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.233628 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.235899 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.285787 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.334950 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.336934 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.353699 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.365790 4816 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.370982 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.420613 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.425010 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.485422 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.641114 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.684106 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.729678 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.749987 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.836892 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.867335 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.953258 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.988606 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.107559 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.126980 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.163535 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.202358 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.341944 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.403428 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.409652 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.428632 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.431658 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.431885 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.460816 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.489196 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.502769 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.531199 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.537573 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.591502 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.608688 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.783319 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.835129 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.903492 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.909339 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.987613 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 16
00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.998829 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.076682 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.107998 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.159184 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.183400 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.331719 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.359909 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.430561 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.474699 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.487698 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.506711 4816 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"openshift-service-ca.crt" Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.521317 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.535028 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.578993 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.607265 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.608827 4816 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.740841 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.967745 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.015530 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.123954 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.149431 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 
00:13:02.165201 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.344955 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.345172 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.365856 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.407295 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.409473 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.502808 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.540702 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.572172 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.597166 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.598058 4816 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.616647 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.629393 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.670688 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.698246 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.726015 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.741859 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.826866 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.921133 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.938642 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.076526 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 16 00:13:03 crc kubenswrapper[4816]: 
I0316 00:13:03.081173 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.100167 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.129021 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.132148 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.159790 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.309848 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.310926 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.482314 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.527499 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.530916 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.666431 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 16 00:13:03 crc 
kubenswrapper[4816]: I0316 00:13:03.667230 4816 scope.go:117] "RemoveContainer" containerID="bf90d2b66ba96bda100b2d7e60bddd86b4409c573e87857f65bb7785355097cf" Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.678584 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.699293 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.740535 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.754462 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.778638 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.819680 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.819749 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9ac66afe7f0f3082bc91e3215ea8df4639682a9c3ef140496b94009ae3f373e1"} Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.885353 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.007701 4816 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.054667 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.107461 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.162038 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.252449 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.552481 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.773528 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.781908 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.848827 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.879262 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.970216 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 16 00:13:05 crc kubenswrapper[4816]: 
I0316 00:13:05.034028 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.050708 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.057489 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.077491 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.145819 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.261340 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.314988 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.372542 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.510438 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.545375 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.572030 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 16 00:13:05 crc 
kubenswrapper[4816]: I0316 00:13:05.580775 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.630995 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.642934 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.697208 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.702747 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.749388 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.784004 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.820245 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.843342 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.885365 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.930832 4816 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.073437 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.135816 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.141461 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.192906 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.223912 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.224483 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.377800 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.437920 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.450050 4816 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.560662 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.569649 4816 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.652512 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.666138 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.705954 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.759214 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.808868 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.938119 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.979771 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.980665 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.997936 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.998679 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 16 00:13:07 crc 
kubenswrapper[4816]: I0316 00:13:07.072147 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.212893 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.324405 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.477402 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.499724 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.542519 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.575429 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.639992 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.655251 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.660113 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.705655 4816 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.762756 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.797639 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.837704 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.854005 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.040946 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.086944 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.173245 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.448790 4816 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.522001 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.522359 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 
00:13:08.638514 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.638708 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.664252 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.754947 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.920830 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.007419 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.011839 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.081697 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.091143 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.121558 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.272910 4816 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.276669 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=47.276653714 podStartE2EDuration="47.276653714s" podCreationTimestamp="2026-03-16 00:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:45.778377683 +0000 UTC m=+358.874677656" watchObservedRunningTime="2026-03-16 00:13:09.276653714 +0000 UTC m=+382.372953667"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.277138 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.277171 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.283756 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.284625 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.302287 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=24.302273077 podStartE2EDuration="24.302273077s" podCreationTimestamp="2026-03-16 00:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:13:09.301032149 +0000 UTC m=+382.397332102" watchObservedRunningTime="2026-03-16 00:13:09.302273077 +0000 UTC m=+382.398573030"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.372777 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.395795 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.505738 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.541266 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.738280 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.738794 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.861349 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.980453 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 16 00:13:10 crc kubenswrapper[4816]: I0316 00:13:10.025583 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 16 00:13:10 crc kubenswrapper[4816]: I0316 00:13:10.294457 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 16 00:13:10 crc kubenswrapper[4816]: I0316 00:13:10.345104 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 16 00:13:10 crc kubenswrapper[4816]: I0316 00:13:10.701584 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 16 00:13:10 crc kubenswrapper[4816]: I0316 00:13:10.815657 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 16 00:13:10 crc kubenswrapper[4816]: I0316 00:13:10.936990 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 16 00:13:11 crc kubenswrapper[4816]: I0316 00:13:11.035374 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 16 00:13:11 crc kubenswrapper[4816]: I0316 00:13:11.047008 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 16 00:13:11 crc kubenswrapper[4816]: I0316 00:13:11.189059 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 16 00:13:11 crc kubenswrapper[4816]: I0316 00:13:11.451743 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 16 00:13:19 crc kubenswrapper[4816]: I0316 00:13:19.502025 4816 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 16 00:13:19 crc kubenswrapper[4816]: I0316 00:13:19.503038 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4" gracePeriod=5
Mar 16 00:13:21 crc kubenswrapper[4816]: I0316 00:13:21.887272 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7fd798dd88-7v9zs"]
Mar 16 00:13:21 crc kubenswrapper[4816]: I0316 00:13:21.887486 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" podUID="c682ba54-60b9-4293-ba42-dbde80524daf" containerName="controller-manager" containerID="cri-o://7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9" gracePeriod=30
Mar 16 00:13:21 crc kubenswrapper[4816]: I0316 00:13:21.988148 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx"]
Mar 16 00:13:21 crc kubenswrapper[4816]: I0316 00:13:21.988659 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" podUID="d843d76b-9317-42aa-848b-e3e11c3106cb" containerName="route-controller-manager" containerID="cri-o://5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634" gracePeriod=30
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.249173 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs"
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.307832 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx"
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.351000 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8d5j\" (UniqueName: \"kubernetes.io/projected/c682ba54-60b9-4293-ba42-dbde80524daf-kube-api-access-z8d5j\") pod \"c682ba54-60b9-4293-ba42-dbde80524daf\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") "
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.351053 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c682ba54-60b9-4293-ba42-dbde80524daf-serving-cert\") pod \"c682ba54-60b9-4293-ba42-dbde80524daf\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") "
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.351112 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-proxy-ca-bundles\") pod \"c682ba54-60b9-4293-ba42-dbde80524daf\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") "
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.351260 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-client-ca\") pod \"c682ba54-60b9-4293-ba42-dbde80524daf\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") "
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.351323 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snc4z\" (UniqueName: \"kubernetes.io/projected/d843d76b-9317-42aa-848b-e3e11c3106cb-kube-api-access-snc4z\") pod \"d843d76b-9317-42aa-848b-e3e11c3106cb\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") "
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.351382 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-config\") pod \"c682ba54-60b9-4293-ba42-dbde80524daf\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") "
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.351419 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-config\") pod \"d843d76b-9317-42aa-848b-e3e11c3106cb\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") "
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.351884 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-client-ca" (OuterVolumeSpecName: "client-ca") pod "c682ba54-60b9-4293-ba42-dbde80524daf" (UID: "c682ba54-60b9-4293-ba42-dbde80524daf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.352093 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c682ba54-60b9-4293-ba42-dbde80524daf" (UID: "c682ba54-60b9-4293-ba42-dbde80524daf"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.352273 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-config" (OuterVolumeSpecName: "config") pod "c682ba54-60b9-4293-ba42-dbde80524daf" (UID: "c682ba54-60b9-4293-ba42-dbde80524daf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.352451 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-config" (OuterVolumeSpecName: "config") pod "d843d76b-9317-42aa-848b-e3e11c3106cb" (UID: "d843d76b-9317-42aa-848b-e3e11c3106cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.356324 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c682ba54-60b9-4293-ba42-dbde80524daf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c682ba54-60b9-4293-ba42-dbde80524daf" (UID: "c682ba54-60b9-4293-ba42-dbde80524daf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.356358 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d843d76b-9317-42aa-848b-e3e11c3106cb-kube-api-access-snc4z" (OuterVolumeSpecName: "kube-api-access-snc4z") pod "d843d76b-9317-42aa-848b-e3e11c3106cb" (UID: "d843d76b-9317-42aa-848b-e3e11c3106cb"). InnerVolumeSpecName "kube-api-access-snc4z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.356482 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c682ba54-60b9-4293-ba42-dbde80524daf-kube-api-access-z8d5j" (OuterVolumeSpecName: "kube-api-access-z8d5j") pod "c682ba54-60b9-4293-ba42-dbde80524daf" (UID: "c682ba54-60b9-4293-ba42-dbde80524daf"). InnerVolumeSpecName "kube-api-access-z8d5j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.452754 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d843d76b-9317-42aa-848b-e3e11c3106cb-serving-cert\") pod \"d843d76b-9317-42aa-848b-e3e11c3106cb\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") "
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.452842 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-client-ca\") pod \"d843d76b-9317-42aa-848b-e3e11c3106cb\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") "
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.452970 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.452980 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.452989 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8d5j\" (UniqueName: \"kubernetes.io/projected/c682ba54-60b9-4293-ba42-dbde80524daf-kube-api-access-z8d5j\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.452999 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c682ba54-60b9-4293-ba42-dbde80524daf-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.453008 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.453017 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-client-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.453025 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snc4z\" (UniqueName: \"kubernetes.io/projected/d843d76b-9317-42aa-848b-e3e11c3106cb-kube-api-access-snc4z\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.453552 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-client-ca" (OuterVolumeSpecName: "client-ca") pod "d843d76b-9317-42aa-848b-e3e11c3106cb" (UID: "d843d76b-9317-42aa-848b-e3e11c3106cb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.456509 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d843d76b-9317-42aa-848b-e3e11c3106cb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d843d76b-9317-42aa-848b-e3e11c3106cb" (UID: "d843d76b-9317-42aa-848b-e3e11c3106cb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.554264 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-client-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.554292 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d843d76b-9317-42aa-848b-e3e11c3106cb-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.615055 4816 generic.go:334] "Generic (PLEG): container finished" podID="c682ba54-60b9-4293-ba42-dbde80524daf" containerID="7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9" exitCode=0
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.615121 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" event={"ID":"c682ba54-60b9-4293-ba42-dbde80524daf","Type":"ContainerDied","Data":"7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9"}
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.615148 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" event={"ID":"c682ba54-60b9-4293-ba42-dbde80524daf","Type":"ContainerDied","Data":"3c1f3c1914f9bb0abf88eda5dfdf58f6ce50fa7199c9993ff27cf3aef4e09894"}
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.615164 4816 scope.go:117] "RemoveContainer" containerID="7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9"
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.615196 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs"
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.625804 4816 generic.go:334] "Generic (PLEG): container finished" podID="d843d76b-9317-42aa-848b-e3e11c3106cb" containerID="5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634" exitCode=0
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.625844 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" event={"ID":"d843d76b-9317-42aa-848b-e3e11c3106cb","Type":"ContainerDied","Data":"5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634"}
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.625888 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" event={"ID":"d843d76b-9317-42aa-848b-e3e11c3106cb","Type":"ContainerDied","Data":"f342289f8f13f5d89f00dac92a6b213282ffa583b6c1a48b772dae90dc55fd82"}
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.626056 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx"
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.646017 4816 scope.go:117] "RemoveContainer" containerID="7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9"
Mar 16 00:13:22 crc kubenswrapper[4816]: E0316 00:13:22.646513 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9\": container with ID starting with 7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9 not found: ID does not exist" containerID="7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9"
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.646541 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9"} err="failed to get container status \"7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9\": rpc error: code = NotFound desc = could not find container \"7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9\": container with ID starting with 7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9 not found: ID does not exist"
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.646576 4816 scope.go:117] "RemoveContainer" containerID="5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634"
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.660948 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7fd798dd88-7v9zs"]
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.665823 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7fd798dd88-7v9zs"]
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.671421 4816 scope.go:117] "RemoveContainer" containerID="5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634"
Mar 16 00:13:22 crc kubenswrapper[4816]: E0316 00:13:22.672063 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634\": container with ID starting with 5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634 not found: ID does not exist" containerID="5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634"
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.672134 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634"} err="failed to get container status \"5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634\": rpc error: code = NotFound desc = could not find container \"5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634\": container with ID starting with 5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634 not found: ID does not exist"
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.686271 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx"]
Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.694687 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx"]
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.454078 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.484791 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-57f9cb4856-kxssx"]
Mar 16 00:13:23 crc kubenswrapper[4816]: E0316 00:13:23.485324 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c682ba54-60b9-4293-ba42-dbde80524daf" containerName="controller-manager"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.485415 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c682ba54-60b9-4293-ba42-dbde80524daf" containerName="controller-manager"
Mar 16 00:13:23 crc kubenswrapper[4816]: E0316 00:13:23.485505 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" containerName="installer"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.485606 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" containerName="installer"
Mar 16 00:13:23 crc kubenswrapper[4816]: E0316 00:13:23.485759 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d843d76b-9317-42aa-848b-e3e11c3106cb" containerName="route-controller-manager"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.485854 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d843d76b-9317-42aa-848b-e3e11c3106cb" containerName="route-controller-manager"
Mar 16 00:13:23 crc kubenswrapper[4816]: E0316 00:13:23.485952 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.486025 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.486243 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.486336 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d843d76b-9317-42aa-848b-e3e11c3106cb" containerName="route-controller-manager"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.486427 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="c682ba54-60b9-4293-ba42-dbde80524daf" containerName="controller-manager"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.486532 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" containerName="installer"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.487082 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.488835 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"]
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.489522 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.491465 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.494998 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.495084 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.495154 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.495513 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.496028 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.496275 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.496890 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.497414 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.497747 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.498125 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.501725 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.502290 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.509680 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57f9cb4856-kxssx"]
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.515534 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"]
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.568163 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl2vf\" (UniqueName: \"kubernetes.io/projected/9bbefecc-c220-48cf-b69a-61571c8ad48f-kube-api-access-xl2vf\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.568229 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-config\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.568262 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt8nr\" (UniqueName: \"kubernetes.io/projected/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-kube-api-access-bt8nr\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.568342 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-proxy-ca-bundles\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.568369 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-client-ca\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.568389 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-client-ca\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.568459 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bbefecc-c220-48cf-b69a-61571c8ad48f-serving-cert\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.568524 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-serving-cert\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.568607 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-config\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.669401 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl2vf\" (UniqueName: \"kubernetes.io/projected/9bbefecc-c220-48cf-b69a-61571c8ad48f-kube-api-access-xl2vf\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.669489 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-config\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.669531 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt8nr\" (UniqueName: \"kubernetes.io/projected/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-kube-api-access-bt8nr\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.669605 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-proxy-ca-bundles\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.669642 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-client-ca\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.669753 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-client-ca\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.669780 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bbefecc-c220-48cf-b69a-61571c8ad48f-serving-cert\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.669818 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-serving-cert\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.669860 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-config\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.671261 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-config\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.675858 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-proxy-ca-bundles\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.676527 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-config\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.676703 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c682ba54-60b9-4293-ba42-dbde80524daf" path="/var/lib/kubelet/pods/c682ba54-60b9-4293-ba42-dbde80524daf/volumes"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.677423 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d843d76b-9317-42aa-848b-e3e11c3106cb" path="/var/lib/kubelet/pods/d843d76b-9317-42aa-848b-e3e11c3106cb/volumes"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.681345 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-serving-cert\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"
Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.683190 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bbefecc-c220-48cf-b69a-61571c8ad48f-serving-cert\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.685525 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-client-ca\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.686811 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-client-ca\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.690135 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl2vf\" (UniqueName: \"kubernetes.io/projected/9bbefecc-c220-48cf-b69a-61571c8ad48f-kube-api-access-xl2vf\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.693506 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt8nr\" (UniqueName: \"kubernetes.io/projected/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-kube-api-access-bt8nr\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:23 crc 
kubenswrapper[4816]: I0316 00:13:23.818454 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.841599 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.023519 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57f9cb4856-kxssx"] Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.089883 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"] Mar 16 00:13:24 crc kubenswrapper[4816]: W0316 00:13:24.098445 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fba9f52_b53b_4cbf_8bfb_0fe0938048c4.slice/crio-0a59117cef9e6d9e6822a03d57d6d20cce4fe9b81cd544f8d6df885b7a59aaf4 WatchSource:0}: Error finding container 0a59117cef9e6d9e6822a03d57d6d20cce4fe9b81cd544f8d6df885b7a59aaf4: Status 404 returned error can't find the container with id 0a59117cef9e6d9e6822a03d57d6d20cce4fe9b81cd544f8d6df885b7a59aaf4 Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.620942 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.621006 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.639526 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" event={"ID":"9bbefecc-c220-48cf-b69a-61571c8ad48f","Type":"ContainerStarted","Data":"f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c"} Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.639603 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" event={"ID":"9bbefecc-c220-48cf-b69a-61571c8ad48f","Type":"ContainerStarted","Data":"03fd2c051d734f0a0c50d6dc9512f0c74423439d3b3fa7ab8edc636db3d5dc09"} Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.639627 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.641952 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.642011 4816 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4" exitCode=137 Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.642086 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.642092 4816 scope.go:117] "RemoveContainer" containerID="eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.644039 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" event={"ID":"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4","Type":"ContainerStarted","Data":"f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec"} Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.644101 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" event={"ID":"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4","Type":"ContainerStarted","Data":"0a59117cef9e6d9e6822a03d57d6d20cce4fe9b81cd544f8d6df885b7a59aaf4"} Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.644397 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.645731 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.649386 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.656462 4816 scope.go:117] "RemoveContainer" containerID="eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4" Mar 16 00:13:24 crc kubenswrapper[4816]: E0316 00:13:24.659854 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4\": container with ID starting with eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4 not found: ID does not exist" containerID="eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.659894 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4"} err="failed to get container status \"eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4\": rpc error: code = NotFound desc = could not find container \"eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4\": container with ID starting with eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4 not found: ID does not exist" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.672833 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" podStartSLOduration=3.672813867 podStartE2EDuration="3.672813867s" podCreationTimestamp="2026-03-16 00:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:13:24.669308815 +0000 UTC m=+397.765608778" watchObservedRunningTime="2026-03-16 00:13:24.672813867 +0000 UTC m=+397.769113820" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.695209 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" podStartSLOduration=2.6951935110000003 podStartE2EDuration="2.695193511s" podCreationTimestamp="2026-03-16 00:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:13:24.693758249 +0000 UTC m=+397.790058202" 
watchObservedRunningTime="2026-03-16 00:13:24.695193511 +0000 UTC m=+397.791493464" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.783032 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.783098 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.783123 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.783159 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.783202 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.783207 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") 
pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.783548 4816 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.783812 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.784261 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.784576 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.792535 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.884428 4816 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.884466 4816 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.884476 4816 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.884490 4816 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:25 crc kubenswrapper[4816]: I0316 00:13:25.608478 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 16 00:13:25 crc kubenswrapper[4816]: I0316 00:13:25.673608 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 16 00:13:25 crc 
kubenswrapper[4816]: I0316 00:13:25.673859 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 16 00:13:25 crc kubenswrapper[4816]: I0316 00:13:25.684226 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 16 00:13:25 crc kubenswrapper[4816]: I0316 00:13:25.684272 4816 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="9aaafc3b-fd98-4f5e-9553-1c09a26bfc4e" Mar 16 00:13:25 crc kubenswrapper[4816]: I0316 00:13:25.687698 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 16 00:13:25 crc kubenswrapper[4816]: I0316 00:13:25.687738 4816 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="9aaafc3b-fd98-4f5e-9553-1c09a26bfc4e" Mar 16 00:13:26 crc kubenswrapper[4816]: I0316 00:13:26.007919 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 16 00:13:28 crc kubenswrapper[4816]: I0316 00:13:28.444252 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 16 00:13:29 crc kubenswrapper[4816]: I0316 00:13:29.052953 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 16 00:13:29 crc kubenswrapper[4816]: I0316 00:13:29.298583 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 16 00:13:32 crc kubenswrapper[4816]: I0316 00:13:32.672788 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 16 
00:13:35 crc kubenswrapper[4816]: I0316 00:13:35.360021 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 16 00:13:35 crc kubenswrapper[4816]: I0316 00:13:35.458947 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 16 00:13:40 crc kubenswrapper[4816]: I0316 00:13:40.788219 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 16 00:13:41 crc kubenswrapper[4816]: I0316 00:13:41.928810 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-57f9cb4856-kxssx"] Mar 16 00:13:41 crc kubenswrapper[4816]: I0316 00:13:41.929848 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" podUID="9bbefecc-c220-48cf-b69a-61571c8ad48f" containerName="controller-manager" containerID="cri-o://f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c" gracePeriod=30 Mar 16 00:13:41 crc kubenswrapper[4816]: I0316 00:13:41.952171 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"] Mar 16 00:13:41 crc kubenswrapper[4816]: I0316 00:13:41.952598 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" podUID="6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" containerName="route-controller-manager" containerID="cri-o://f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec" gracePeriod=30 Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.440007 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.505448 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt8nr\" (UniqueName: \"kubernetes.io/projected/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-kube-api-access-bt8nr\") pod \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.505518 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-serving-cert\") pod \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.505619 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-config\") pod \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.505668 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-client-ca\") pod \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.506337 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-client-ca" (OuterVolumeSpecName: "client-ca") pod "6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" (UID: "6fba9f52-b53b-4cbf-8bfb-0fe0938048c4"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.506411 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-config" (OuterVolumeSpecName: "config") pod "6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" (UID: "6fba9f52-b53b-4cbf-8bfb-0fe0938048c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.512733 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" (UID: "6fba9f52-b53b-4cbf-8bfb-0fe0938048c4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.525355 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-kube-api-access-bt8nr" (OuterVolumeSpecName: "kube-api-access-bt8nr") pod "6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" (UID: "6fba9f52-b53b-4cbf-8bfb-0fe0938048c4"). InnerVolumeSpecName "kube-api-access-bt8nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.551894 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.606567 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl2vf\" (UniqueName: \"kubernetes.io/projected/9bbefecc-c220-48cf-b69a-61571c8ad48f-kube-api-access-xl2vf\") pod \"9bbefecc-c220-48cf-b69a-61571c8ad48f\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.606613 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-config\") pod \"9bbefecc-c220-48cf-b69a-61571c8ad48f\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.606639 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bbefecc-c220-48cf-b69a-61571c8ad48f-serving-cert\") pod \"9bbefecc-c220-48cf-b69a-61571c8ad48f\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.606703 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-proxy-ca-bundles\") pod \"9bbefecc-c220-48cf-b69a-61571c8ad48f\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.606719 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-client-ca\") pod \"9bbefecc-c220-48cf-b69a-61571c8ad48f\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607413 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9bbefecc-c220-48cf-b69a-61571c8ad48f" (UID: "9bbefecc-c220-48cf-b69a-61571c8ad48f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607427 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-client-ca" (OuterVolumeSpecName: "client-ca") pod "9bbefecc-c220-48cf-b69a-61571c8ad48f" (UID: "9bbefecc-c220-48cf-b69a-61571c8ad48f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607478 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-config" (OuterVolumeSpecName: "config") pod "9bbefecc-c220-48cf-b69a-61571c8ad48f" (UID: "9bbefecc-c220-48cf-b69a-61571c8ad48f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607672 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607694 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607706 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607717 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607727 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607738 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt8nr\" (UniqueName: \"kubernetes.io/projected/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-kube-api-access-bt8nr\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607748 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.610445 4816 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bbefecc-c220-48cf-b69a-61571c8ad48f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9bbefecc-c220-48cf-b69a-61571c8ad48f" (UID: "9bbefecc-c220-48cf-b69a-61571c8ad48f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.614491 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.615324 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bbefecc-c220-48cf-b69a-61571c8ad48f-kube-api-access-xl2vf" (OuterVolumeSpecName: "kube-api-access-xl2vf") pod "9bbefecc-c220-48cf-b69a-61571c8ad48f" (UID: "9bbefecc-c220-48cf-b69a-61571c8ad48f"). InnerVolumeSpecName "kube-api-access-xl2vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.709340 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl2vf\" (UniqueName: \"kubernetes.io/projected/9bbefecc-c220-48cf-b69a-61571c8ad48f-kube-api-access-xl2vf\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.709379 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bbefecc-c220-48cf-b69a-61571c8ad48f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.764923 4816 generic.go:334] "Generic (PLEG): container finished" podID="9bbefecc-c220-48cf-b69a-61571c8ad48f" containerID="f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c" exitCode=0 Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.765010 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" 
event={"ID":"9bbefecc-c220-48cf-b69a-61571c8ad48f","Type":"ContainerDied","Data":"f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c"} Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.765043 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" event={"ID":"9bbefecc-c220-48cf-b69a-61571c8ad48f","Type":"ContainerDied","Data":"03fd2c051d734f0a0c50d6dc9512f0c74423439d3b3fa7ab8edc636db3d5dc09"} Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.765063 4816 scope.go:117] "RemoveContainer" containerID="f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.765203 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.772642 4816 generic.go:334] "Generic (PLEG): container finished" podID="6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" containerID="f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec" exitCode=0 Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.772897 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" event={"ID":"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4","Type":"ContainerDied","Data":"f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec"} Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.772924 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" event={"ID":"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4","Type":"ContainerDied","Data":"0a59117cef9e6d9e6822a03d57d6d20cce4fe9b81cd544f8d6df885b7a59aaf4"} Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.773011 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.785404 4816 scope.go:117] "RemoveContainer" containerID="f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c" Mar 16 00:13:42 crc kubenswrapper[4816]: E0316 00:13:42.785909 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c\": container with ID starting with f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c not found: ID does not exist" containerID="f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.785960 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c"} err="failed to get container status \"f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c\": rpc error: code = NotFound desc = could not find container \"f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c\": container with ID starting with f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c not found: ID does not exist" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.786189 4816 scope.go:117] "RemoveContainer" containerID="f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.800668 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-57f9cb4856-kxssx"] Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.804059 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-57f9cb4856-kxssx"] Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.819695 4816 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"] Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.822049 4816 scope.go:117] "RemoveContainer" containerID="f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec" Mar 16 00:13:42 crc kubenswrapper[4816]: E0316 00:13:42.822533 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec\": container with ID starting with f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec not found: ID does not exist" containerID="f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.822692 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec"} err="failed to get container status \"f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec\": rpc error: code = NotFound desc = could not find container \"f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec\": container with ID starting with f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec not found: ID does not exist" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.825548 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"] Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.501723 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27"] Mar 16 00:13:43 crc kubenswrapper[4816]: E0316 00:13:43.502001 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" containerName="route-controller-manager" Mar 16 00:13:43 crc kubenswrapper[4816]: 
I0316 00:13:43.502014 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" containerName="route-controller-manager" Mar 16 00:13:43 crc kubenswrapper[4816]: E0316 00:13:43.502029 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bbefecc-c220-48cf-b69a-61571c8ad48f" containerName="controller-manager" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.502035 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bbefecc-c220-48cf-b69a-61571c8ad48f" containerName="controller-manager" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.502154 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" containerName="route-controller-manager" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.502167 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bbefecc-c220-48cf-b69a-61571c8ad48f" containerName="controller-manager" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.502618 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.503584 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b94474fcc-vzgw6"] Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.504298 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.506762 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.506798 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.509376 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.510987 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.510996 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.511012 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.510993 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.511209 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.513713 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.514108 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 16 00:13:43 crc 
kubenswrapper[4816]: I0316 00:13:43.514139 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.514259 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.518166 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b94474fcc-vzgw6"] Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.519504 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-proxy-ca-bundles\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.519566 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-config\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.519684 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khgb4\" (UniqueName: \"kubernetes.io/projected/14521451-81b6-4214-883a-cd05a9357517-kube-api-access-khgb4\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.519778 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-client-ca\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.519802 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-client-ca\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.519864 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-serving-cert\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.519904 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14521451-81b6-4214-883a-cd05a9357517-serving-cert\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.519947 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tf9t\" (UniqueName: \"kubernetes.io/projected/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-kube-api-access-8tf9t\") pod 
\"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.519973 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-config\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.520044 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.522332 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27"] Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.621746 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-config\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.621813 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-proxy-ca-bundles\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.621853 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-config\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.621902 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khgb4\" (UniqueName: \"kubernetes.io/projected/14521451-81b6-4214-883a-cd05a9357517-kube-api-access-khgb4\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.621938 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-client-ca\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.621968 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-client-ca\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.621996 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-serving-cert\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc 
kubenswrapper[4816]: I0316 00:13:43.622016 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14521451-81b6-4214-883a-cd05a9357517-serving-cert\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.622043 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tf9t\" (UniqueName: \"kubernetes.io/projected/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-kube-api-access-8tf9t\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.623055 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-proxy-ca-bundles\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.623087 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-client-ca\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.623132 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-config\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: 
\"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.623150 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-client-ca\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.624094 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-config\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.632663 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-serving-cert\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.634144 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14521451-81b6-4214-883a-cd05a9357517-serving-cert\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.640269 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tf9t\" (UniqueName: 
\"kubernetes.io/projected/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-kube-api-access-8tf9t\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.640468 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khgb4\" (UniqueName: \"kubernetes.io/projected/14521451-81b6-4214-883a-cd05a9357517-kube-api-access-khgb4\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.673766 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" path="/var/lib/kubelet/pods/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4/volumes" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.674327 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bbefecc-c220-48cf-b69a-61571c8ad48f" path="/var/lib/kubelet/pods/9bbefecc-c220-48cf-b69a-61571c8ad48f/volumes" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.822018 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.845615 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.068148 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b94474fcc-vzgw6"] Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.235817 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27"] Mar 16 00:13:44 crc kubenswrapper[4816]: W0316 00:13:44.240738 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ab40c56_2ac9_49f5_8370_ff1ae7c5757f.slice/crio-916114dd72baadc2e3d1c4e882df4092bba32ea74a936ec4e52471a9ade09699 WatchSource:0}: Error finding container 916114dd72baadc2e3d1c4e882df4092bba32ea74a936ec4e52471a9ade09699: Status 404 returned error can't find the container with id 916114dd72baadc2e3d1c4e882df4092bba32ea74a936ec4e52471a9ade09699 Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.625402 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hvpqn"] Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.625983 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hvpqn" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerName="registry-server" containerID="cri-o://8f95ead769819114b5324ad74b013a299738b066a23d9b7aab0526d5b3f15f3a" gracePeriod=2 Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.786902 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" event={"ID":"14521451-81b6-4214-883a-cd05a9357517","Type":"ContainerStarted","Data":"83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4"} Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.786947 4816 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" event={"ID":"14521451-81b6-4214-883a-cd05a9357517","Type":"ContainerStarted","Data":"7fd72c6bda85d63e3016f0f126428c17855b91cdeed58a7acc5610da9342d4f1"} Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.787184 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.789252 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" event={"ID":"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f","Type":"ContainerStarted","Data":"6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a"} Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.789294 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" event={"ID":"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f","Type":"ContainerStarted","Data":"916114dd72baadc2e3d1c4e882df4092bba32ea74a936ec4e52471a9ade09699"} Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.789407 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.791133 4816 generic.go:334] "Generic (PLEG): container finished" podID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerID="8f95ead769819114b5324ad74b013a299738b066a23d9b7aab0526d5b3f15f3a" exitCode=0 Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.791158 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvpqn" event={"ID":"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d","Type":"ContainerDied","Data":"8f95ead769819114b5324ad74b013a299738b066a23d9b7aab0526d5b3f15f3a"} Mar 16 00:13:44 crc kubenswrapper[4816]: 
I0316 00:13:44.793311 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.808199 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" podStartSLOduration=3.808180665 podStartE2EDuration="3.808180665s" podCreationTimestamp="2026-03-16 00:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:13:44.804960631 +0000 UTC m=+417.901260584" watchObservedRunningTime="2026-03-16 00:13:44.808180665 +0000 UTC m=+417.904480618" Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.968672 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.998106 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" podStartSLOduration=3.9980854409999997 podStartE2EDuration="3.998085441s" podCreationTimestamp="2026-03-16 00:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:13:44.845992929 +0000 UTC m=+417.942292882" watchObservedRunningTime="2026-03-16 00:13:44.998085441 +0000 UTC m=+418.094385394" Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.132703 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.242912 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.248610 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-utilities\") pod \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.248674 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-catalog-content\") pod \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.248733 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk7j9\" (UniqueName: \"kubernetes.io/projected/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-kube-api-access-qk7j9\") pod \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.249582 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-utilities" (OuterVolumeSpecName: "utilities") pod "cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" (UID: "cc1ea93d-1cf8-4145-ad35-83f2d1357f9d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.256402 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-kube-api-access-qk7j9" (OuterVolumeSpecName: "kube-api-access-qk7j9") pod "cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" (UID: "cc1ea93d-1cf8-4145-ad35-83f2d1357f9d"). InnerVolumeSpecName "kube-api-access-qk7j9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.350533 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk7j9\" (UniqueName: \"kubernetes.io/projected/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-kube-api-access-qk7j9\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.350580 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-utilities\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.383233 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" (UID: "cc1ea93d-1cf8-4145-ad35-83f2d1357f9d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.451417 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.797311 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvpqn" event={"ID":"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d","Type":"ContainerDied","Data":"a7d840d19860a5867af8d4206630041069552968b0c74710a21974d2b8f8f661"}
Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.797952 4816 scope.go:117] "RemoveContainer" containerID="8f95ead769819114b5324ad74b013a299738b066a23d9b7aab0526d5b3f15f3a"
Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.797355 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvpqn"
Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.812670 4816 scope.go:117] "RemoveContainer" containerID="12e67fc1baf84e28d7eb14a44704825a68bd0357e121983c70625a0778be907a"
Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.815816 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hvpqn"]
Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.820085 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hvpqn"]
Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.830034 4816 scope.go:117] "RemoveContainer" containerID="d3d02136defedca51b696822546773a5d6f3e05f0581bc5504bae4a17393efcc"
Mar 16 00:13:47 crc kubenswrapper[4816]: I0316 00:13:47.675897 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" path="/var/lib/kubelet/pods/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d/volumes"
Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.172919 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560334-7sx8j"]
Mar 16 00:14:00 crc kubenswrapper[4816]: E0316 00:14:00.173807 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerName="registry-server"
Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.173824 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerName="registry-server"
Mar 16 00:14:00 crc kubenswrapper[4816]: E0316 00:14:00.173841 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerName="extract-utilities"
Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.173849 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerName="extract-utilities"
Mar 16 00:14:00 crc kubenswrapper[4816]: E0316 00:14:00.173861 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerName="extract-content"
Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.173869 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerName="extract-content"
Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.174005 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerName="registry-server"
Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.180756 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560334-7sx8j"]
Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.180862 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560334-7sx8j"
Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.184460 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.184704 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r"
Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.184923 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.227200 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5d8q\" (UniqueName: \"kubernetes.io/projected/5160d394-3d9b-4066-9bea-b9dd787b2a42-kube-api-access-t5d8q\") pod \"auto-csr-approver-29560334-7sx8j\" (UID: \"5160d394-3d9b-4066-9bea-b9dd787b2a42\") " pod="openshift-infra/auto-csr-approver-29560334-7sx8j"
Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.328324 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5d8q\" (UniqueName: \"kubernetes.io/projected/5160d394-3d9b-4066-9bea-b9dd787b2a42-kube-api-access-t5d8q\") pod \"auto-csr-approver-29560334-7sx8j\" (UID: \"5160d394-3d9b-4066-9bea-b9dd787b2a42\") " pod="openshift-infra/auto-csr-approver-29560334-7sx8j"
Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.360636 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5d8q\" (UniqueName: \"kubernetes.io/projected/5160d394-3d9b-4066-9bea-b9dd787b2a42-kube-api-access-t5d8q\") pod \"auto-csr-approver-29560334-7sx8j\" (UID: \"5160d394-3d9b-4066-9bea-b9dd787b2a42\") " pod="openshift-infra/auto-csr-approver-29560334-7sx8j"
Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.498004 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560334-7sx8j"
Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.913258 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560334-7sx8j"]
Mar 16 00:14:00 crc kubenswrapper[4816]: W0316 00:14:00.924757 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5160d394_3d9b_4066_9bea_b9dd787b2a42.slice/crio-a091429ab10db04195b4dce0908b618f38a120a1d24a9b0b32f5812cce430ee8 WatchSource:0}: Error finding container a091429ab10db04195b4dce0908b618f38a120a1d24a9b0b32f5812cce430ee8: Status 404 returned error can't find the container with id a091429ab10db04195b4dce0908b618f38a120a1d24a9b0b32f5812cce430ee8
Mar 16 00:14:01 crc kubenswrapper[4816]: I0316 00:14:01.887444 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560334-7sx8j" event={"ID":"5160d394-3d9b-4066-9bea-b9dd787b2a42","Type":"ContainerStarted","Data":"a091429ab10db04195b4dce0908b618f38a120a1d24a9b0b32f5812cce430ee8"}
Mar 16 00:14:01 crc kubenswrapper[4816]: I0316 00:14:01.942960 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b94474fcc-vzgw6"]
Mar 16 00:14:01 crc kubenswrapper[4816]: I0316 00:14:01.943452 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" podUID="14521451-81b6-4214-883a-cd05a9357517" containerName="controller-manager" containerID="cri-o://83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4" gracePeriod=30
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.039974 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27"]
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.040464 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" podUID="2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" containerName="route-controller-manager" containerID="cri-o://6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a" gracePeriod=30
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.519610 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27"
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.531943 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6"
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.555689 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-client-ca\") pod \"14521451-81b6-4214-883a-cd05a9357517\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") "
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.555793 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tf9t\" (UniqueName: \"kubernetes.io/projected/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-kube-api-access-8tf9t\") pod \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") "
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.555857 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14521451-81b6-4214-883a-cd05a9357517-serving-cert\") pod \"14521451-81b6-4214-883a-cd05a9357517\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") "
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.555889 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-config\") pod \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") "
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.555962 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-client-ca\") pod \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") "
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.556001 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-config\") pod \"14521451-81b6-4214-883a-cd05a9357517\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") "
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.556032 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-proxy-ca-bundles\") pod \"14521451-81b6-4214-883a-cd05a9357517\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") "
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.556067 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khgb4\" (UniqueName: \"kubernetes.io/projected/14521451-81b6-4214-883a-cd05a9357517-kube-api-access-khgb4\") pod \"14521451-81b6-4214-883a-cd05a9357517\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") "
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.556122 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-serving-cert\") pod \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") "
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.557497 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-client-ca" (OuterVolumeSpecName: "client-ca") pod "2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" (UID: "2ab40c56-2ac9-49f5-8370-ff1ae7c5757f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.558845 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-config" (OuterVolumeSpecName: "config") pod "14521451-81b6-4214-883a-cd05a9357517" (UID: "14521451-81b6-4214-883a-cd05a9357517"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.559305 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-config" (OuterVolumeSpecName: "config") pod "2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" (UID: "2ab40c56-2ac9-49f5-8370-ff1ae7c5757f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.560025 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "14521451-81b6-4214-883a-cd05a9357517" (UID: "14521451-81b6-4214-883a-cd05a9357517"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.562089 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-client-ca" (OuterVolumeSpecName: "client-ca") pod "14521451-81b6-4214-883a-cd05a9357517" (UID: "14521451-81b6-4214-883a-cd05a9357517"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.562707 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" (UID: "2ab40c56-2ac9-49f5-8370-ff1ae7c5757f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.565001 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-kube-api-access-8tf9t" (OuterVolumeSpecName: "kube-api-access-8tf9t") pod "2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" (UID: "2ab40c56-2ac9-49f5-8370-ff1ae7c5757f"). InnerVolumeSpecName "kube-api-access-8tf9t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.565058 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14521451-81b6-4214-883a-cd05a9357517-kube-api-access-khgb4" (OuterVolumeSpecName: "kube-api-access-khgb4") pod "14521451-81b6-4214-883a-cd05a9357517" (UID: "14521451-81b6-4214-883a-cd05a9357517"). InnerVolumeSpecName "kube-api-access-khgb4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.566787 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14521451-81b6-4214-883a-cd05a9357517-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "14521451-81b6-4214-883a-cd05a9357517" (UID: "14521451-81b6-4214-883a-cd05a9357517"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.657340 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tf9t\" (UniqueName: \"kubernetes.io/projected/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-kube-api-access-8tf9t\") on node \"crc\" DevicePath \"\""
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.657390 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14521451-81b6-4214-883a-cd05a9357517-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.657404 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.657414 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-client-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.657425 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.657436 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.657445 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khgb4\" (UniqueName: \"kubernetes.io/projected/14521451-81b6-4214-883a-cd05a9357517-kube-api-access-khgb4\") on node \"crc\" DevicePath \"\""
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.657455 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.657464 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-client-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.896802 4816 generic.go:334] "Generic (PLEG): container finished" podID="14521451-81b6-4214-883a-cd05a9357517" containerID="83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4" exitCode=0
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.896875 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" event={"ID":"14521451-81b6-4214-883a-cd05a9357517","Type":"ContainerDied","Data":"83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4"}
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.896906 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" event={"ID":"14521451-81b6-4214-883a-cd05a9357517","Type":"ContainerDied","Data":"7fd72c6bda85d63e3016f0f126428c17855b91cdeed58a7acc5610da9342d4f1"}
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.896925 4816 scope.go:117] "RemoveContainer" containerID="83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4"
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.897045 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6"
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.900196 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" containerID="6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a" exitCode=0
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.900264 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" event={"ID":"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f","Type":"ContainerDied","Data":"6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a"}
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.900324 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" event={"ID":"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f","Type":"ContainerDied","Data":"916114dd72baadc2e3d1c4e882df4092bba32ea74a936ec4e52471a9ade09699"}
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.900276 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27"
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.904887 4816 generic.go:334] "Generic (PLEG): container finished" podID="5160d394-3d9b-4066-9bea-b9dd787b2a42" containerID="d0a220f8f08fc88ffdf56d37ec2ba1b59974be62f3a81d988b1462b4794a79a8" exitCode=0
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.904959 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560334-7sx8j" event={"ID":"5160d394-3d9b-4066-9bea-b9dd787b2a42","Type":"ContainerDied","Data":"d0a220f8f08fc88ffdf56d37ec2ba1b59974be62f3a81d988b1462b4794a79a8"}
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.918217 4816 scope.go:117] "RemoveContainer" containerID="83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4"
Mar 16 00:14:02 crc kubenswrapper[4816]: E0316 00:14:02.918618 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4\": container with ID starting with 83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4 not found: ID does not exist" containerID="83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4"
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.918648 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4"} err="failed to get container status \"83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4\": rpc error: code = NotFound desc = could not find container \"83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4\": container with ID starting with 83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4 not found: ID does not exist"
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.919120 4816 scope.go:117] "RemoveContainer" containerID="6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a"
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.935397 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b94474fcc-vzgw6"]
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.937211 4816 scope.go:117] "RemoveContainer" containerID="6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a"
Mar 16 00:14:02 crc kubenswrapper[4816]: E0316 00:14:02.937664 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a\": container with ID starting with 6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a not found: ID does not exist" containerID="6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a"
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.937707 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a"} err="failed to get container status \"6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a\": rpc error: code = NotFound desc = could not find container \"6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a\": container with ID starting with 6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a not found: ID does not exist"
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.938811 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b94474fcc-vzgw6"]
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.949230 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27"]
Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.955394 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27"]
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.522183 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"]
Mar 16 00:14:03 crc kubenswrapper[4816]: E0316 00:14:03.522526 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14521451-81b6-4214-883a-cd05a9357517" containerName="controller-manager"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.522594 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="14521451-81b6-4214-883a-cd05a9357517" containerName="controller-manager"
Mar 16 00:14:03 crc kubenswrapper[4816]: E0316 00:14:03.522632 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" containerName="route-controller-manager"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.522649 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" containerName="route-controller-manager"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.522819 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="14521451-81b6-4214-883a-cd05a9357517" containerName="controller-manager"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.522857 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" containerName="route-controller-manager"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.523446 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.525309 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.525901 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.526059 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.526355 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.526632 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.528718 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.530322 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66849d8997-mdw7r"]
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.531152 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.534132 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.534153 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.534156 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.535881 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.535970 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.537219 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.540988 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.541721 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"]
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.546466 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66849d8997-mdw7r"]
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.569052 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-config\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.569345 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-client-ca\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.569503 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-client-ca\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.569653 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee03d74f-fc71-4caa-b296-7bde75124d84-serving-cert\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.569826 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-config\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.569905 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b56ft\" (UniqueName: \"kubernetes.io/projected/ee03d74f-fc71-4caa-b296-7bde75124d84-kube-api-access-b56ft\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.569941 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpr8t\" (UniqueName: \"kubernetes.io/projected/d54cdab6-e848-4a30-b64f-7b257a403479-kube-api-access-tpr8t\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.569988 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-proxy-ca-bundles\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.570016 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d54cdab6-e848-4a30-b64f-7b257a403479-serving-cert\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.670533 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-client-ca\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.670630 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-client-ca\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.670688 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee03d74f-fc71-4caa-b296-7bde75124d84-serving-cert\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.670752 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-config\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.670785 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b56ft\" (UniqueName: \"kubernetes.io/projected/ee03d74f-fc71-4caa-b296-7bde75124d84-kube-api-access-b56ft\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.670806 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpr8t\" (UniqueName: \"kubernetes.io/projected/d54cdab6-e848-4a30-b64f-7b257a403479-kube-api-access-tpr8t\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.670835 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-proxy-ca-bundles\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.670855 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d54cdab6-e848-4a30-b64f-7b257a403479-serving-cert\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.670904 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-config\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.671942 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-client-ca\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.672801 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-proxy-ca-bundles\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.672951 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-client-ca\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.674397 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-config\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.675541 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee03d74f-fc71-4caa-b296-7bde75124d84-serving-cert\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.676047 4816 kubelet_volumes.go:163]
"Cleaned up orphaned pod volumes dir" podUID="14521451-81b6-4214-883a-cd05a9357517" path="/var/lib/kubelet/pods/14521451-81b6-4214-883a-cd05a9357517/volumes" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.676763 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" path="/var/lib/kubelet/pods/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f/volumes" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.686411 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d54cdab6-e848-4a30-b64f-7b257a403479-serving-cert\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.687308 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-config\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.690032 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpr8t\" (UniqueName: \"kubernetes.io/projected/d54cdab6-e848-4a30-b64f-7b257a403479-kube-api-access-tpr8t\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.692239 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b56ft\" (UniqueName: \"kubernetes.io/projected/ee03d74f-fc71-4caa-b296-7bde75124d84-kube-api-access-b56ft\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: 
\"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.850173 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.863869 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.166111 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560334-7sx8j" Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.177170 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5d8q\" (UniqueName: \"kubernetes.io/projected/5160d394-3d9b-4066-9bea-b9dd787b2a42-kube-api-access-t5d8q\") pod \"5160d394-3d9b-4066-9bea-b9dd787b2a42\" (UID: \"5160d394-3d9b-4066-9bea-b9dd787b2a42\") " Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.182119 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5160d394-3d9b-4066-9bea-b9dd787b2a42-kube-api-access-t5d8q" (OuterVolumeSpecName: "kube-api-access-t5d8q") pod "5160d394-3d9b-4066-9bea-b9dd787b2a42" (UID: "5160d394-3d9b-4066-9bea-b9dd787b2a42"). InnerVolumeSpecName "kube-api-access-t5d8q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.277670 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5d8q\" (UniqueName: \"kubernetes.io/projected/5160d394-3d9b-4066-9bea-b9dd787b2a42-kube-api-access-t5d8q\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.304409 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66849d8997-mdw7r"] Mar 16 00:14:04 crc kubenswrapper[4816]: W0316 00:14:04.309436 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd54cdab6_e848_4a30_b64f_7b257a403479.slice/crio-740a40d4e9799ef7106757215750d89d9a0af1e6c2b93294ab89c6e7631aa93f WatchSource:0}: Error finding container 740a40d4e9799ef7106757215750d89d9a0af1e6c2b93294ab89c6e7631aa93f: Status 404 returned error can't find the container with id 740a40d4e9799ef7106757215750d89d9a0af1e6c2b93294ab89c6e7631aa93f Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.361696 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"] Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.917371 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560334-7sx8j" event={"ID":"5160d394-3d9b-4066-9bea-b9dd787b2a42","Type":"ContainerDied","Data":"a091429ab10db04195b4dce0908b618f38a120a1d24a9b0b32f5812cce430ee8"} Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.917719 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a091429ab10db04195b4dce0908b618f38a120a1d24a9b0b32f5812cce430ee8" Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.917632 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560334-7sx8j" Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.919047 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" event={"ID":"ee03d74f-fc71-4caa-b296-7bde75124d84","Type":"ContainerStarted","Data":"0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848"} Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.919098 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" event={"ID":"ee03d74f-fc71-4caa-b296-7bde75124d84","Type":"ContainerStarted","Data":"3979922d6fce0cead24bc526158f3a5bdcfa05832f696a88b2f4edcb0bfa3aa5"} Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.919346 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.921382 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" event={"ID":"d54cdab6-e848-4a30-b64f-7b257a403479","Type":"ContainerStarted","Data":"f446c2ae65638de11812a4e1adcc4638681b15af837aa10730e65bcb03368dfd"} Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.921766 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" event={"ID":"d54cdab6-e848-4a30-b64f-7b257a403479","Type":"ContainerStarted","Data":"740a40d4e9799ef7106757215750d89d9a0af1e6c2b93294ab89c6e7631aa93f"} Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.921798 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.926430 4816 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.926870 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.940771 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" podStartSLOduration=2.940753571 podStartE2EDuration="2.940753571s" podCreationTimestamp="2026-03-16 00:14:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:14:04.938971989 +0000 UTC m=+438.035271942" watchObservedRunningTime="2026-03-16 00:14:04.940753571 +0000 UTC m=+438.037053524" Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.980810 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" podStartSLOduration=3.98078479 podStartE2EDuration="3.98078479s" podCreationTimestamp="2026-03-16 00:14:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:14:04.979651817 +0000 UTC m=+438.075951780" watchObservedRunningTime="2026-03-16 00:14:04.98078479 +0000 UTC m=+438.077084743" Mar 16 00:14:21 crc kubenswrapper[4816]: I0316 00:14:21.898170 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66849d8997-mdw7r"] Mar 16 00:14:21 crc kubenswrapper[4816]: I0316 00:14:21.899021 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" podUID="d54cdab6-e848-4a30-b64f-7b257a403479" 
containerName="controller-manager" containerID="cri-o://f446c2ae65638de11812a4e1adcc4638681b15af837aa10730e65bcb03368dfd" gracePeriod=30 Mar 16 00:14:21 crc kubenswrapper[4816]: I0316 00:14:21.931320 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"] Mar 16 00:14:21 crc kubenswrapper[4816]: I0316 00:14:21.931531 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" podUID="ee03d74f-fc71-4caa-b296-7bde75124d84" containerName="route-controller-manager" containerID="cri-o://0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848" gracePeriod=30 Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.035351 4816 generic.go:334] "Generic (PLEG): container finished" podID="d54cdab6-e848-4a30-b64f-7b257a403479" containerID="f446c2ae65638de11812a4e1adcc4638681b15af837aa10730e65bcb03368dfd" exitCode=0 Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.035393 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" event={"ID":"d54cdab6-e848-4a30-b64f-7b257a403479","Type":"ContainerDied","Data":"f446c2ae65638de11812a4e1adcc4638681b15af837aa10730e65bcb03368dfd"} Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.446770 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.544368 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.626299 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee03d74f-fc71-4caa-b296-7bde75124d84-serving-cert\") pod \"ee03d74f-fc71-4caa-b296-7bde75124d84\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.626354 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-config\") pod \"ee03d74f-fc71-4caa-b296-7bde75124d84\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.626478 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-config\") pod \"d54cdab6-e848-4a30-b64f-7b257a403479\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.626506 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b56ft\" (UniqueName: \"kubernetes.io/projected/ee03d74f-fc71-4caa-b296-7bde75124d84-kube-api-access-b56ft\") pod \"ee03d74f-fc71-4caa-b296-7bde75124d84\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.626540 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-proxy-ca-bundles\") pod \"d54cdab6-e848-4a30-b64f-7b257a403479\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.626581 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-client-ca\") pod \"ee03d74f-fc71-4caa-b296-7bde75124d84\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.627240 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-config" (OuterVolumeSpecName: "config") pod "ee03d74f-fc71-4caa-b296-7bde75124d84" (UID: "ee03d74f-fc71-4caa-b296-7bde75124d84"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.627308 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpr8t\" (UniqueName: \"kubernetes.io/projected/d54cdab6-e848-4a30-b64f-7b257a403479-kube-api-access-tpr8t\") pod \"d54cdab6-e848-4a30-b64f-7b257a403479\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.627337 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-client-ca\") pod \"d54cdab6-e848-4a30-b64f-7b257a403479\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.627456 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d54cdab6-e848-4a30-b64f-7b257a403479" (UID: "d54cdab6-e848-4a30-b64f-7b257a403479"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.627502 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.627501 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-config" (OuterVolumeSpecName: "config") pod "d54cdab6-e848-4a30-b64f-7b257a403479" (UID: "d54cdab6-e848-4a30-b64f-7b257a403479"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.627739 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-client-ca" (OuterVolumeSpecName: "client-ca") pod "d54cdab6-e848-4a30-b64f-7b257a403479" (UID: "d54cdab6-e848-4a30-b64f-7b257a403479"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.627927 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-client-ca" (OuterVolumeSpecName: "client-ca") pod "ee03d74f-fc71-4caa-b296-7bde75124d84" (UID: "ee03d74f-fc71-4caa-b296-7bde75124d84"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.632418 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee03d74f-fc71-4caa-b296-7bde75124d84-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ee03d74f-fc71-4caa-b296-7bde75124d84" (UID: "ee03d74f-fc71-4caa-b296-7bde75124d84"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.632478 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee03d74f-fc71-4caa-b296-7bde75124d84-kube-api-access-b56ft" (OuterVolumeSpecName: "kube-api-access-b56ft") pod "ee03d74f-fc71-4caa-b296-7bde75124d84" (UID: "ee03d74f-fc71-4caa-b296-7bde75124d84"). InnerVolumeSpecName "kube-api-access-b56ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.632723 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d54cdab6-e848-4a30-b64f-7b257a403479-kube-api-access-tpr8t" (OuterVolumeSpecName: "kube-api-access-tpr8t") pod "d54cdab6-e848-4a30-b64f-7b257a403479" (UID: "d54cdab6-e848-4a30-b64f-7b257a403479"). InnerVolumeSpecName "kube-api-access-tpr8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.728033 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d54cdab6-e848-4a30-b64f-7b257a403479-serving-cert\") pod \"d54cdab6-e848-4a30-b64f-7b257a403479\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.728326 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee03d74f-fc71-4caa-b296-7bde75124d84-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.728801 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.728824 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b56ft\" (UniqueName: 
\"kubernetes.io/projected/ee03d74f-fc71-4caa-b296-7bde75124d84-kube-api-access-b56ft\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.728837 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.728865 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.728878 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpr8t\" (UniqueName: \"kubernetes.io/projected/d54cdab6-e848-4a30-b64f-7b257a403479-kube-api-access-tpr8t\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.728889 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.730677 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d54cdab6-e848-4a30-b64f-7b257a403479-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d54cdab6-e848-4a30-b64f-7b257a403479" (UID: "d54cdab6-e848-4a30-b64f-7b257a403479"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.831477 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d54cdab6-e848-4a30-b64f-7b257a403479-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.044391 4816 generic.go:334] "Generic (PLEG): container finished" podID="ee03d74f-fc71-4caa-b296-7bde75124d84" containerID="0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848" exitCode=0 Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.044582 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.044591 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" event={"ID":"ee03d74f-fc71-4caa-b296-7bde75124d84","Type":"ContainerDied","Data":"0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848"} Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.044751 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" event={"ID":"ee03d74f-fc71-4caa-b296-7bde75124d84","Type":"ContainerDied","Data":"3979922d6fce0cead24bc526158f3a5bdcfa05832f696a88b2f4edcb0bfa3aa5"} Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.044788 4816 scope.go:117] "RemoveContainer" containerID="0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.047465 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" 
event={"ID":"d54cdab6-e848-4a30-b64f-7b257a403479","Type":"ContainerDied","Data":"740a40d4e9799ef7106757215750d89d9a0af1e6c2b93294ab89c6e7631aa93f"} Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.047706 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.068364 4816 scope.go:117] "RemoveContainer" containerID="0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848" Mar 16 00:14:23 crc kubenswrapper[4816]: E0316 00:14:23.070454 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848\": container with ID starting with 0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848 not found: ID does not exist" containerID="0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.070497 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848"} err="failed to get container status \"0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848\": rpc error: code = NotFound desc = could not find container \"0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848\": container with ID starting with 0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848 not found: ID does not exist" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.070523 4816 scope.go:117] "RemoveContainer" containerID="f446c2ae65638de11812a4e1adcc4638681b15af837aa10730e65bcb03368dfd" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.102754 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"] Mar 16 00:14:23 
crc kubenswrapper[4816]: I0316 00:14:23.103881 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"] Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.119392 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66849d8997-mdw7r"] Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.124057 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-66849d8997-mdw7r"] Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.539712 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg"] Mar 16 00:14:23 crc kubenswrapper[4816]: E0316 00:14:23.539978 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5160d394-3d9b-4066-9bea-b9dd787b2a42" containerName="oc" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.539992 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="5160d394-3d9b-4066-9bea-b9dd787b2a42" containerName="oc" Mar 16 00:14:23 crc kubenswrapper[4816]: E0316 00:14:23.540019 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54cdab6-e848-4a30-b64f-7b257a403479" containerName="controller-manager" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.540027 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54cdab6-e848-4a30-b64f-7b257a403479" containerName="controller-manager" Mar 16 00:14:23 crc kubenswrapper[4816]: E0316 00:14:23.540044 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee03d74f-fc71-4caa-b296-7bde75124d84" containerName="route-controller-manager" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.540053 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee03d74f-fc71-4caa-b296-7bde75124d84" containerName="route-controller-manager" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 
00:14:23.540167 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d54cdab6-e848-4a30-b64f-7b257a403479" containerName="controller-manager" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.540185 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee03d74f-fc71-4caa-b296-7bde75124d84" containerName="route-controller-manager" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.540196 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="5160d394-3d9b-4066-9bea-b9dd787b2a42" containerName="oc" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.540676 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.544400 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.544482 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.544405 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.545798 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.546149 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.548175 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5959f55db6-fjth7"] Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.549652 4816 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.549874 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.555862 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.556349 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.556484 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg"] Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.557120 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.557465 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.560123 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.561085 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.570302 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5959f55db6-fjth7"] Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.572798 4816 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.673089 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d54cdab6-e848-4a30-b64f-7b257a403479" path="/var/lib/kubelet/pods/d54cdab6-e848-4a30-b64f-7b257a403479/volumes" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.673596 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee03d74f-fc71-4caa-b296-7bde75124d84" path="/var/lib/kubelet/pods/ee03d74f-fc71-4caa-b296-7bde75124d84/volumes" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.741772 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd82845b-11d2-4f56-baef-9217ec8fb5d9-serving-cert\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.741841 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52abc92a-155b-4167-826a-de9f1aa0ce44-serving-cert\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.741922 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd82845b-11d2-4f56-baef-9217ec8fb5d9-proxy-ca-bundles\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.741952 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8dvk\" (UniqueName: \"kubernetes.io/projected/bd82845b-11d2-4f56-baef-9217ec8fb5d9-kube-api-access-c8dvk\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.741972 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lmzh\" (UniqueName: \"kubernetes.io/projected/52abc92a-155b-4167-826a-de9f1aa0ce44-kube-api-access-6lmzh\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.741993 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52abc92a-155b-4167-826a-de9f1aa0ce44-client-ca\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.742023 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd82845b-11d2-4f56-baef-9217ec8fb5d9-config\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.742043 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd82845b-11d2-4f56-baef-9217ec8fb5d9-client-ca\") pod 
\"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.742070 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52abc92a-155b-4167-826a-de9f1aa0ce44-config\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.843322 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52abc92a-155b-4167-826a-de9f1aa0ce44-client-ca\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.843425 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd82845b-11d2-4f56-baef-9217ec8fb5d9-config\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.843502 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd82845b-11d2-4f56-baef-9217ec8fb5d9-client-ca\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.843612 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/52abc92a-155b-4167-826a-de9f1aa0ce44-config\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.843712 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd82845b-11d2-4f56-baef-9217ec8fb5d9-serving-cert\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.843784 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52abc92a-155b-4167-826a-de9f1aa0ce44-serving-cert\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.843855 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd82845b-11d2-4f56-baef-9217ec8fb5d9-proxy-ca-bundles\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.843903 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8dvk\" (UniqueName: \"kubernetes.io/projected/bd82845b-11d2-4f56-baef-9217ec8fb5d9-kube-api-access-c8dvk\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 
00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.843953 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lmzh\" (UniqueName: \"kubernetes.io/projected/52abc92a-155b-4167-826a-de9f1aa0ce44-kube-api-access-6lmzh\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.845531 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52abc92a-155b-4167-826a-de9f1aa0ce44-client-ca\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.846351 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52abc92a-155b-4167-826a-de9f1aa0ce44-config\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.846455 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd82845b-11d2-4f56-baef-9217ec8fb5d9-config\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.846532 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd82845b-11d2-4f56-baef-9217ec8fb5d9-client-ca\") pod \"controller-manager-5959f55db6-fjth7\" (UID: 
\"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.847389 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd82845b-11d2-4f56-baef-9217ec8fb5d9-proxy-ca-bundles\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.850086 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52abc92a-155b-4167-826a-de9f1aa0ce44-serving-cert\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.857722 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd82845b-11d2-4f56-baef-9217ec8fb5d9-serving-cert\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.876984 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lmzh\" (UniqueName: \"kubernetes.io/projected/52abc92a-155b-4167-826a-de9f1aa0ce44-kube-api-access-6lmzh\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.880883 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8dvk\" (UniqueName: 
\"kubernetes.io/projected/bd82845b-11d2-4f56-baef-9217ec8fb5d9-kube-api-access-c8dvk\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.881107 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.900786 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:24 crc kubenswrapper[4816]: I0316 00:14:24.206084 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5959f55db6-fjth7"] Mar 16 00:14:24 crc kubenswrapper[4816]: W0316 00:14:24.342067 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52abc92a_155b_4167_826a_de9f1aa0ce44.slice/crio-47458505eb5293d436500049d82178fbb3a456e34ebfa3f32d2a53fddf62df0c WatchSource:0}: Error finding container 47458505eb5293d436500049d82178fbb3a456e34ebfa3f32d2a53fddf62df0c: Status 404 returned error can't find the container with id 47458505eb5293d436500049d82178fbb3a456e34ebfa3f32d2a53fddf62df0c Mar 16 00:14:24 crc kubenswrapper[4816]: I0316 00:14:24.343317 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg"] Mar 16 00:14:25 crc kubenswrapper[4816]: I0316 00:14:25.067672 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" event={"ID":"bd82845b-11d2-4f56-baef-9217ec8fb5d9","Type":"ContainerStarted","Data":"2e215180fdce74e05454572f7c396d2d53b12a2397f880975a9a7a4cdbd2b141"} Mar 16 00:14:25 crc kubenswrapper[4816]: 
I0316 00:14:25.067997 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" event={"ID":"bd82845b-11d2-4f56-baef-9217ec8fb5d9","Type":"ContainerStarted","Data":"12fb963e32c36ab5a15988d6a8d9cbcef352176f4c3d4d9612d64e86b4de2ee1"} Mar 16 00:14:25 crc kubenswrapper[4816]: I0316 00:14:25.068023 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:25 crc kubenswrapper[4816]: I0316 00:14:25.069788 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" event={"ID":"52abc92a-155b-4167-826a-de9f1aa0ce44","Type":"ContainerStarted","Data":"91e8e1aaa2727c987a514ce7c4c9cf6da470e267ad18af4e3a5c0d7c59840589"} Mar 16 00:14:25 crc kubenswrapper[4816]: I0316 00:14:25.069817 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" event={"ID":"52abc92a-155b-4167-826a-de9f1aa0ce44","Type":"ContainerStarted","Data":"47458505eb5293d436500049d82178fbb3a456e34ebfa3f32d2a53fddf62df0c"} Mar 16 00:14:25 crc kubenswrapper[4816]: I0316 00:14:25.070038 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:25 crc kubenswrapper[4816]: I0316 00:14:25.074006 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:25 crc kubenswrapper[4816]: I0316 00:14:25.075851 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:25 crc kubenswrapper[4816]: I0316 00:14:25.090987 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" podStartSLOduration=4.090967533 podStartE2EDuration="4.090967533s" podCreationTimestamp="2026-03-16 00:14:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:14:25.087202793 +0000 UTC m=+458.183502756" watchObservedRunningTime="2026-03-16 00:14:25.090967533 +0000 UTC m=+458.187267506" Mar 16 00:14:25 crc kubenswrapper[4816]: I0316 00:14:25.111014 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" podStartSLOduration=4.110988507 podStartE2EDuration="4.110988507s" podCreationTimestamp="2026-03-16 00:14:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:14:25.104176008 +0000 UTC m=+458.200475971" watchObservedRunningTime="2026-03-16 00:14:25.110988507 +0000 UTC m=+458.207288500" Mar 16 00:14:31 crc kubenswrapper[4816]: I0316 00:14:31.863368 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:14:31 crc kubenswrapper[4816]: I0316 00:14:31.864062 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:14:47 crc kubenswrapper[4816]: I0316 00:14:47.421644 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sshl5"] Mar 16 
00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.141888 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7"] Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.143063 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.145651 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.145763 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.154453 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7"] Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.315952 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a9c69dc-0684-421d-a7aa-6fb257f59909-secret-volume\") pod \"collect-profiles-29560335-p96p7\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.316056 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9lxp\" (UniqueName: \"kubernetes.io/projected/3a9c69dc-0684-421d-a7aa-6fb257f59909-kube-api-access-t9lxp\") pod \"collect-profiles-29560335-p96p7\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.316180 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a9c69dc-0684-421d-a7aa-6fb257f59909-config-volume\") pod \"collect-profiles-29560335-p96p7\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.416931 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a9c69dc-0684-421d-a7aa-6fb257f59909-secret-volume\") pod \"collect-profiles-29560335-p96p7\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.416984 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9lxp\" (UniqueName: \"kubernetes.io/projected/3a9c69dc-0684-421d-a7aa-6fb257f59909-kube-api-access-t9lxp\") pod \"collect-profiles-29560335-p96p7\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.417032 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a9c69dc-0684-421d-a7aa-6fb257f59909-config-volume\") pod \"collect-profiles-29560335-p96p7\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.417993 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a9c69dc-0684-421d-a7aa-6fb257f59909-config-volume\") pod \"collect-profiles-29560335-p96p7\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.423807 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a9c69dc-0684-421d-a7aa-6fb257f59909-secret-volume\") pod \"collect-profiles-29560335-p96p7\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.439201 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9lxp\" (UniqueName: \"kubernetes.io/projected/3a9c69dc-0684-421d-a7aa-6fb257f59909-kube-api-access-t9lxp\") pod \"collect-profiles-29560335-p96p7\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.462820 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.861877 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7"] Mar 16 00:15:01 crc kubenswrapper[4816]: I0316 00:15:01.305683 4816 generic.go:334] "Generic (PLEG): container finished" podID="3a9c69dc-0684-421d-a7aa-6fb257f59909" containerID="2f274fd6c476ac0e73a688a7bbce794f8e491674e1b0838204690a79b7a28dfd" exitCode=0 Mar 16 00:15:01 crc kubenswrapper[4816]: I0316 00:15:01.305743 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" event={"ID":"3a9c69dc-0684-421d-a7aa-6fb257f59909","Type":"ContainerDied","Data":"2f274fd6c476ac0e73a688a7bbce794f8e491674e1b0838204690a79b7a28dfd"} Mar 16 00:15:01 crc kubenswrapper[4816]: I0316 00:15:01.305772 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" event={"ID":"3a9c69dc-0684-421d-a7aa-6fb257f59909","Type":"ContainerStarted","Data":"648ab9f6adc93138c8533a5442aa38b4b2995f055ec75688ef2547b9f3713571"} Mar 16 00:15:01 crc kubenswrapper[4816]: I0316 00:15:01.863653 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:15:01 crc kubenswrapper[4816]: I0316 00:15:01.863749 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:15:02 
crc kubenswrapper[4816]: I0316 00:15:02.673026 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:02 crc kubenswrapper[4816]: I0316 00:15:02.860157 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a9c69dc-0684-421d-a7aa-6fb257f59909-config-volume\") pod \"3a9c69dc-0684-421d-a7aa-6fb257f59909\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " Mar 16 00:15:02 crc kubenswrapper[4816]: I0316 00:15:02.860248 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a9c69dc-0684-421d-a7aa-6fb257f59909-secret-volume\") pod \"3a9c69dc-0684-421d-a7aa-6fb257f59909\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " Mar 16 00:15:02 crc kubenswrapper[4816]: I0316 00:15:02.860286 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9lxp\" (UniqueName: \"kubernetes.io/projected/3a9c69dc-0684-421d-a7aa-6fb257f59909-kube-api-access-t9lxp\") pod \"3a9c69dc-0684-421d-a7aa-6fb257f59909\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " Mar 16 00:15:02 crc kubenswrapper[4816]: I0316 00:15:02.861264 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9c69dc-0684-421d-a7aa-6fb257f59909-config-volume" (OuterVolumeSpecName: "config-volume") pod "3a9c69dc-0684-421d-a7aa-6fb257f59909" (UID: "3a9c69dc-0684-421d-a7aa-6fb257f59909"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:15:02 crc kubenswrapper[4816]: I0316 00:15:02.871358 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a9c69dc-0684-421d-a7aa-6fb257f59909-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3a9c69dc-0684-421d-a7aa-6fb257f59909" (UID: "3a9c69dc-0684-421d-a7aa-6fb257f59909"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:02 crc kubenswrapper[4816]: I0316 00:15:02.871543 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a9c69dc-0684-421d-a7aa-6fb257f59909-kube-api-access-t9lxp" (OuterVolumeSpecName: "kube-api-access-t9lxp") pod "3a9c69dc-0684-421d-a7aa-6fb257f59909" (UID: "3a9c69dc-0684-421d-a7aa-6fb257f59909"). InnerVolumeSpecName "kube-api-access-t9lxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:15:02 crc kubenswrapper[4816]: I0316 00:15:02.961584 4816 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a9c69dc-0684-421d-a7aa-6fb257f59909-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:02 crc kubenswrapper[4816]: I0316 00:15:02.961643 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9lxp\" (UniqueName: \"kubernetes.io/projected/3a9c69dc-0684-421d-a7aa-6fb257f59909-kube-api-access-t9lxp\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:02 crc kubenswrapper[4816]: I0316 00:15:02.961664 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a9c69dc-0684-421d-a7aa-6fb257f59909-config-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:03 crc kubenswrapper[4816]: I0316 00:15:03.318476 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" 
event={"ID":"3a9c69dc-0684-421d-a7aa-6fb257f59909","Type":"ContainerDied","Data":"648ab9f6adc93138c8533a5442aa38b4b2995f055ec75688ef2547b9f3713571"} Mar 16 00:15:03 crc kubenswrapper[4816]: I0316 00:15:03.318532 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="648ab9f6adc93138c8533a5442aa38b4b2995f055ec75688ef2547b9f3713571" Mar 16 00:15:03 crc kubenswrapper[4816]: I0316 00:15:03.318611 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.452175 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" podUID="7c3e347f-464a-43f1-bf29-689bf81a28e6" containerName="oauth-openshift" containerID="cri-o://e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62" gracePeriod=15 Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.898413 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907436 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-idp-0-file-data\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907533 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-ocp-branding-template\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907585 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-cliconfig\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907607 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-error\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907641 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-serving-cert\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: 
\"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907670 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-service-ca\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907701 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-router-certs\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907724 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-session\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907751 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-provider-selection\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907794 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-trusted-ca-bundle\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " 
Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907826 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-policies\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907854 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxdqt\" (UniqueName: \"kubernetes.io/projected/7c3e347f-464a-43f1-bf29-689bf81a28e6-kube-api-access-kxdqt\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907876 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-login\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907902 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-dir\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.908203 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.910078 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.910231 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.910466 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.911735 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.914733 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.914979 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c3e347f-464a-43f1-bf29-689bf81a28e6-kube-api-access-kxdqt" (OuterVolumeSpecName: "kube-api-access-kxdqt") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "kube-api-access-kxdqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.915962 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.916402 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.916670 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.916933 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.917094 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.931718 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.934743 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.953413 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6d789fbbf-wkdfq"] Mar 16 00:15:12 crc kubenswrapper[4816]: E0316 00:15:12.953682 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3e347f-464a-43f1-bf29-689bf81a28e6" containerName="oauth-openshift" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.953697 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3e347f-464a-43f1-bf29-689bf81a28e6" containerName="oauth-openshift" Mar 16 00:15:12 crc kubenswrapper[4816]: E0316 00:15:12.953710 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9c69dc-0684-421d-a7aa-6fb257f59909" containerName="collect-profiles" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.953719 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9c69dc-0684-421d-a7aa-6fb257f59909" containerName="collect-profiles" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.953904 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a9c69dc-0684-421d-a7aa-6fb257f59909" containerName="collect-profiles" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.953949 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c3e347f-464a-43f1-bf29-689bf81a28e6" containerName="oauth-openshift" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.954679 4816 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.970169 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6d789fbbf-wkdfq"] Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.008895 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-template-login\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.008936 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.008965 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.008984 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-router-certs\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009008 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/26a979e8-1691-4390-82da-4229125eb297-audit-dir\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009027 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-session\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009045 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-template-error\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009063 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " 
pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009080 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-service-ca\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009102 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009121 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009140 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-audit-policies\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009157 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js2jb\" (UniqueName: \"kubernetes.io/projected/26a979e8-1691-4390-82da-4229125eb297-kube-api-access-js2jb\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009178 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009228 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009240 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009251 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009260 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009269 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009278 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009288 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009297 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009307 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009319 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 
00:15:13.009329 4816 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009338 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxdqt\" (UniqueName: \"kubernetes.io/projected/7c3e347f-464a-43f1-bf29-689bf81a28e6-kube-api-access-kxdqt\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009349 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009358 4816 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.110834 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/26a979e8-1691-4390-82da-4229125eb297-audit-dir\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.110872 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-session\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.110906 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-template-error\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.110987 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111039 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-service-ca\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111081 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111110 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111135 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-audit-policies\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111155 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js2jb\" (UniqueName: \"kubernetes.io/projected/26a979e8-1691-4390-82da-4229125eb297-kube-api-access-js2jb\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111186 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111234 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-template-login\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111285 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111316 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111346 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-router-certs\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.112862 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.113110 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-audit-policies\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: 
\"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.110931 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/26a979e8-1691-4390-82da-4229125eb297-audit-dir\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.114066 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-service-ca\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.114913 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.115779 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-template-error\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.116461 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-session\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.117576 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.118067 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-template-login\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.118247 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-router-certs\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.119485 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc 
kubenswrapper[4816]: I0316 00:15:13.120130 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.124329 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.130449 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js2jb\" (UniqueName: \"kubernetes.io/projected/26a979e8-1691-4390-82da-4229125eb297-kube-api-access-js2jb\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.290075 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.383202 4816 generic.go:334] "Generic (PLEG): container finished" podID="7c3e347f-464a-43f1-bf29-689bf81a28e6" containerID="e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62" exitCode=0 Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.383258 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" event={"ID":"7c3e347f-464a-43f1-bf29-689bf81a28e6","Type":"ContainerDied","Data":"e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62"} Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.383275 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.383300 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" event={"ID":"7c3e347f-464a-43f1-bf29-689bf81a28e6","Type":"ContainerDied","Data":"3631ced358fcea8ef22224f7b1a8e3a7674d52e4a7296b38cf119840b4577b45"} Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.383324 4816 scope.go:117] "RemoveContainer" containerID="e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.412915 4816 scope.go:117] "RemoveContainer" containerID="e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62" Mar 16 00:15:13 crc kubenswrapper[4816]: E0316 00:15:13.413785 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62\": container with ID starting with e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62 not found: ID does not exist" 
containerID="e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.413829 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62"} err="failed to get container status \"e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62\": rpc error: code = NotFound desc = could not find container \"e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62\": container with ID starting with e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62 not found: ID does not exist" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.428387 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sshl5"] Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.433113 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sshl5"] Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.681420 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c3e347f-464a-43f1-bf29-689bf81a28e6" path="/var/lib/kubelet/pods/7c3e347f-464a-43f1-bf29-689bf81a28e6/volumes" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.745937 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6d789fbbf-wkdfq"] Mar 16 00:15:14 crc kubenswrapper[4816]: I0316 00:15:14.390134 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" event={"ID":"26a979e8-1691-4390-82da-4229125eb297","Type":"ContainerStarted","Data":"6f88b1b3cf345f29131fb6d7972de6f49cd3d8631240363894017f0427d4e311"} Mar 16 00:15:14 crc kubenswrapper[4816]: I0316 00:15:14.391619 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" 
event={"ID":"26a979e8-1691-4390-82da-4229125eb297","Type":"ContainerStarted","Data":"9017a83ec7c036dddf975c0a4343a653f66606edad95f40aaaa39233661e7659"} Mar 16 00:15:14 crc kubenswrapper[4816]: I0316 00:15:14.391725 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:14 crc kubenswrapper[4816]: I0316 00:15:14.413862 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:14 crc kubenswrapper[4816]: I0316 00:15:14.444274 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" podStartSLOduration=27.444244887 podStartE2EDuration="27.444244887s" podCreationTimestamp="2026-03-16 00:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:15:14.410114751 +0000 UTC m=+507.506414724" watchObservedRunningTime="2026-03-16 00:15:14.444244887 +0000 UTC m=+507.540544880" Mar 16 00:15:31 crc kubenswrapper[4816]: I0316 00:15:31.863008 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:15:31 crc kubenswrapper[4816]: I0316 00:15:31.864614 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:15:31 crc kubenswrapper[4816]: I0316 00:15:31.864753 4816 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:15:31 crc kubenswrapper[4816]: I0316 00:15:31.865258 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8214b8a7550606e587b215ee7c72e3638e054dd083cb6fa7b37990d33bec509b"} pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:15:31 crc kubenswrapper[4816]: I0316 00:15:31.865387 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" containerID="cri-o://8214b8a7550606e587b215ee7c72e3638e054dd083cb6fa7b37990d33bec509b" gracePeriod=600 Mar 16 00:15:32 crc kubenswrapper[4816]: I0316 00:15:32.513976 4816 generic.go:334] "Generic (PLEG): container finished" podID="dd08ece2-7636-4966-973a-e96a34b70b53" containerID="8214b8a7550606e587b215ee7c72e3638e054dd083cb6fa7b37990d33bec509b" exitCode=0 Mar 16 00:15:32 crc kubenswrapper[4816]: I0316 00:15:32.514063 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerDied","Data":"8214b8a7550606e587b215ee7c72e3638e054dd083cb6fa7b37990d33bec509b"} Mar 16 00:15:32 crc kubenswrapper[4816]: I0316 00:15:32.514382 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"054dcd9294a0533063364a3ea7e009e513fea0236f1afad37201a02a85a0eee4"} Mar 16 00:15:32 crc kubenswrapper[4816]: I0316 00:15:32.514407 4816 scope.go:117] "RemoveContainer" 
containerID="7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.531520 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wh2h7"] Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.532406 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wh2h7" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerName="registry-server" containerID="cri-o://0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d" gracePeriod=30 Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.551441 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4gwcw"] Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.551797 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4gwcw" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerName="registry-server" containerID="cri-o://1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc" gracePeriod=30 Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.571788 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8226q"] Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.572230 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" podUID="02854230-6165-4f22-8780-d8591b991132" containerName="marketplace-operator" containerID="cri-o://6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e" gracePeriod=30 Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.577520 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pb49"] Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.577846 4816 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7pb49" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerName="registry-server" containerID="cri-o://625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf" gracePeriod=30 Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.597177 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-52qs6"] Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.598967 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-52qs6" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerName="registry-server" containerID="cri-o://f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2" gracePeriod=30 Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.601712 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8ln7g"] Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.602450 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.616327 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8ln7g"] Mar 16 00:15:52 crc kubenswrapper[4816]: E0316 00:15:52.666362 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc is running failed: container process not found" containerID="1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:15:52 crc kubenswrapper[4816]: E0316 00:15:52.666706 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc is running failed: container process not found" containerID="1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:15:52 crc kubenswrapper[4816]: E0316 00:15:52.666978 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc is running failed: container process not found" containerID="1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:15:52 crc kubenswrapper[4816]: E0316 00:15:52.667040 4816 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc is running failed: container process not found" probeType="Readiness" 
pod="openshift-marketplace/community-operators-4gwcw" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerName="registry-server" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.686682 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d197f63-0b7c-496d-89bb-9cd70933969a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8ln7g\" (UID: \"6d197f63-0b7c-496d-89bb-9cd70933969a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.686739 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d197f63-0b7c-496d-89bb-9cd70933969a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8ln7g\" (UID: \"6d197f63-0b7c-496d-89bb-9cd70933969a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.686769 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7twhq\" (UniqueName: \"kubernetes.io/projected/6d197f63-0b7c-496d-89bb-9cd70933969a-kube-api-access-7twhq\") pod \"marketplace-operator-79b997595-8ln7g\" (UID: \"6d197f63-0b7c-496d-89bb-9cd70933969a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: E0316 00:15:52.694232 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02854230_6165_4f22_8780_d8591b991132.slice/crio-6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1b3efd0_cdc0_4973_8077_bcd1ea567bdd.slice/crio-0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad80e1a9_75dc_4860_9bd9_d59b0c0ae43c.slice/crio-1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc.scope\": RecentStats: unable to find data in memory cache]" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.796352 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d197f63-0b7c-496d-89bb-9cd70933969a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8ln7g\" (UID: \"6d197f63-0b7c-496d-89bb-9cd70933969a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.796426 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d197f63-0b7c-496d-89bb-9cd70933969a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8ln7g\" (UID: \"6d197f63-0b7c-496d-89bb-9cd70933969a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.796457 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7twhq\" (UniqueName: \"kubernetes.io/projected/6d197f63-0b7c-496d-89bb-9cd70933969a-kube-api-access-7twhq\") pod \"marketplace-operator-79b997595-8ln7g\" (UID: \"6d197f63-0b7c-496d-89bb-9cd70933969a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.798621 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6d197f63-0b7c-496d-89bb-9cd70933969a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8ln7g\" (UID: \"6d197f63-0b7c-496d-89bb-9cd70933969a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.807298 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d197f63-0b7c-496d-89bb-9cd70933969a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8ln7g\" (UID: \"6d197f63-0b7c-496d-89bb-9cd70933969a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.820369 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7twhq\" (UniqueName: \"kubernetes.io/projected/6d197f63-0b7c-496d-89bb-9cd70933969a-kube-api-access-7twhq\") pod \"marketplace-operator-79b997595-8ln7g\" (UID: \"6d197f63-0b7c-496d-89bb-9cd70933969a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.937738 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.019457 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.107378 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-utilities\") pod \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.107430 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4mjs\" (UniqueName: \"kubernetes.io/projected/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-kube-api-access-w4mjs\") pod \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.107516 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-catalog-content\") pod \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.108809 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-utilities" (OuterVolumeSpecName: "utilities") pod "b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" (UID: "b1b3efd0-cdc0-4973-8077-bcd1ea567bdd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.120872 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-kube-api-access-w4mjs" (OuterVolumeSpecName: "kube-api-access-w4mjs") pod "b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" (UID: "b1b3efd0-cdc0-4973-8077-bcd1ea567bdd"). InnerVolumeSpecName "kube-api-access-w4mjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.132478 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.144406 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.149301 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.164584 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" (UID: "b1b3efd0-cdc0-4973-8077-bcd1ea567bdd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.181385 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.208153 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrpff\" (UniqueName: \"kubernetes.io/projected/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-kube-api-access-mrpff\") pod \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.208263 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-catalog-content\") pod \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.208311 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02854230-6165-4f22-8780-d8591b991132-marketplace-trusted-ca\") pod \"02854230-6165-4f22-8780-d8591b991132\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.208373 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-catalog-content\") pod \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209093 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02854230-6165-4f22-8780-d8591b991132-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "02854230-6165-4f22-8780-d8591b991132" (UID: "02854230-6165-4f22-8780-d8591b991132"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209164 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-catalog-content\") pod \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209198 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j24xj\" (UniqueName: \"kubernetes.io/projected/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-kube-api-access-j24xj\") pod \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209237 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/02854230-6165-4f22-8780-d8591b991132-marketplace-operator-metrics\") pod \"02854230-6165-4f22-8780-d8591b991132\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209268 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgh99\" (UniqueName: \"kubernetes.io/projected/02854230-6165-4f22-8780-d8591b991132-kube-api-access-zgh99\") pod \"02854230-6165-4f22-8780-d8591b991132\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209290 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45bbd\" (UniqueName: \"kubernetes.io/projected/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-kube-api-access-45bbd\") pod \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209318 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-utilities\") pod \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209334 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-utilities\") pod \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209355 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-utilities\") pod \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209753 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209772 4816 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02854230-6165-4f22-8780-d8591b991132-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209786 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209795 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4mjs\" (UniqueName: \"kubernetes.io/projected/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-kube-api-access-w4mjs\") 
on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.210754 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-utilities" (OuterVolumeSpecName: "utilities") pod "a5ba22dd-8e8e-4beb-a540-e5c9687810b8" (UID: "a5ba22dd-8e8e-4beb-a540-e5c9687810b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.212214 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-utilities" (OuterVolumeSpecName: "utilities") pod "6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" (UID: "6ca6c2c9-3a12-4eb3-9df1-7fdea640791d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.213061 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-utilities" (OuterVolumeSpecName: "utilities") pod "ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" (UID: "ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.214407 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-kube-api-access-j24xj" (OuterVolumeSpecName: "kube-api-access-j24xj") pod "a5ba22dd-8e8e-4beb-a540-e5c9687810b8" (UID: "a5ba22dd-8e8e-4beb-a540-e5c9687810b8"). InnerVolumeSpecName "kube-api-access-j24xj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.214511 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-kube-api-access-mrpff" (OuterVolumeSpecName: "kube-api-access-mrpff") pod "6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" (UID: "6ca6c2c9-3a12-4eb3-9df1-7fdea640791d"). InnerVolumeSpecName "kube-api-access-mrpff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.214620 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02854230-6165-4f22-8780-d8591b991132-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "02854230-6165-4f22-8780-d8591b991132" (UID: "02854230-6165-4f22-8780-d8591b991132"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.214730 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02854230-6165-4f22-8780-d8591b991132-kube-api-access-zgh99" (OuterVolumeSpecName: "kube-api-access-zgh99") pod "02854230-6165-4f22-8780-d8591b991132" (UID: "02854230-6165-4f22-8780-d8591b991132"). InnerVolumeSpecName "kube-api-access-zgh99". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.222236 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-kube-api-access-45bbd" (OuterVolumeSpecName: "kube-api-access-45bbd") pod "ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" (UID: "ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c"). InnerVolumeSpecName "kube-api-access-45bbd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.238095 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5ba22dd-8e8e-4beb-a540-e5c9687810b8" (UID: "a5ba22dd-8e8e-4beb-a540-e5c9687810b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.266119 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" (UID: "ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311273 4816 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/02854230-6165-4f22-8780-d8591b991132-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311328 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgh99\" (UniqueName: \"kubernetes.io/projected/02854230-6165-4f22-8780-d8591b991132-kube-api-access-zgh99\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311341 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45bbd\" (UniqueName: \"kubernetes.io/projected/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-kube-api-access-45bbd\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311356 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311371 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311382 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311393 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrpff\" (UniqueName: \"kubernetes.io/projected/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-kube-api-access-mrpff\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311404 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311413 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311424 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j24xj\" (UniqueName: \"kubernetes.io/projected/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-kube-api-access-j24xj\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.337414 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod 
"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" (UID: "6ca6c2c9-3a12-4eb3-9df1-7fdea640791d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.412879 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.472356 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8ln7g"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.645084 4816 generic.go:334] "Generic (PLEG): container finished" podID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerID="f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2" exitCode=0 Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.645141 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qs6" event={"ID":"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d","Type":"ContainerDied","Data":"f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.645169 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qs6" event={"ID":"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d","Type":"ContainerDied","Data":"47689d47c5b861a3bd4357a2faba7a8ab87d56775475b31d461c37bf8423f524"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.645190 4816 scope.go:117] "RemoveContainer" containerID="f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.645299 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.654715 4816 generic.go:334] "Generic (PLEG): container finished" podID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerID="625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf" exitCode=0 Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.654780 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pb49" event={"ID":"a5ba22dd-8e8e-4beb-a540-e5c9687810b8","Type":"ContainerDied","Data":"625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.654807 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pb49" event={"ID":"a5ba22dd-8e8e-4beb-a540-e5c9687810b8","Type":"ContainerDied","Data":"7718a309c71ba8a48a463087b2e901f51d954ea050a7be786e3c0a847d6a54eb"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.654868 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.659687 4816 generic.go:334] "Generic (PLEG): container finished" podID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerID="1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc" exitCode=0 Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.659756 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4gwcw" event={"ID":"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c","Type":"ContainerDied","Data":"1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.659788 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4gwcw" event={"ID":"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c","Type":"ContainerDied","Data":"661437598c338aed0d5a7d52e67330434003899adaefd998268791f6175ab8ca"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.659861 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.672345 4816 generic.go:334] "Generic (PLEG): container finished" podID="02854230-6165-4f22-8780-d8591b991132" containerID="6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e" exitCode=0 Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.672478 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.676711 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" event={"ID":"02854230-6165-4f22-8780-d8591b991132","Type":"ContainerDied","Data":"6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.676754 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" event={"ID":"02854230-6165-4f22-8780-d8591b991132","Type":"ContainerDied","Data":"fbc545a6e69e36c7e153d8947909848cfdb5be666c80ed949869b9fabb25d45a"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.680976 4816 generic.go:334] "Generic (PLEG): container finished" podID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerID="0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d" exitCode=0 Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.681092 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wh2h7" event={"ID":"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd","Type":"ContainerDied","Data":"0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.681118 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wh2h7" event={"ID":"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd","Type":"ContainerDied","Data":"0e89bdbfb4ed11608191b3360966bdeb2f13767d41154d3097545518437bcaec"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.681198 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.686149 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" event={"ID":"6d197f63-0b7c-496d-89bb-9cd70933969a","Type":"ContainerStarted","Data":"cd3d797954a516d4cc00dfe574bc7782894689d4edd8ec9d9626045825209edb"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.686192 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" event={"ID":"6d197f63-0b7c-496d-89bb-9cd70933969a","Type":"ContainerStarted","Data":"e8b07ddb778279b84b392d5ba788140e7de4b1ade3517d804c46c4a859a16c55"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.687828 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.689187 4816 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8ln7g container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.76:8080/healthz\": dial tcp 10.217.0.76:8080: connect: connection refused" start-of-body= Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.689245 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" podUID="6d197f63-0b7c-496d-89bb-9cd70933969a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.76:8080/healthz\": dial tcp 10.217.0.76:8080: connect: connection refused" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.702423 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" podStartSLOduration=1.702402886 podStartE2EDuration="1.702402886s" podCreationTimestamp="2026-03-16 
00:15:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:15:53.699012477 +0000 UTC m=+546.795312440" watchObservedRunningTime="2026-03-16 00:15:53.702402886 +0000 UTC m=+546.798702839" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.709695 4816 scope.go:117] "RemoveContainer" containerID="43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.741671 4816 scope.go:117] "RemoveContainer" containerID="056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.763167 4816 scope.go:117] "RemoveContainer" containerID="f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.765380 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2\": container with ID starting with f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2 not found: ID does not exist" containerID="f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.765413 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2"} err="failed to get container status \"f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2\": rpc error: code = NotFound desc = could not find container \"f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2\": container with ID starting with f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2 not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.765436 4816 scope.go:117] "RemoveContainer" 
containerID="43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.765789 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72\": container with ID starting with 43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72 not found: ID does not exist" containerID="43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.765811 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72"} err="failed to get container status \"43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72\": rpc error: code = NotFound desc = could not find container \"43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72\": container with ID starting with 43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72 not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.765823 4816 scope.go:117] "RemoveContainer" containerID="056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.766179 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503\": container with ID starting with 056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503 not found: ID does not exist" containerID="056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.766206 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503"} err="failed to get container status \"056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503\": rpc error: code = NotFound desc = could not find container \"056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503\": container with ID starting with 056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503 not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.766218 4816 scope.go:117] "RemoveContainer" containerID="625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.770117 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4gwcw"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.774507 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4gwcw"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.778857 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-52qs6"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.782268 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-52qs6"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.791570 4816 scope.go:117] "RemoveContainer" containerID="908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.795107 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8226q"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.799525 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8226q"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.810151 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-7pb49"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.818063 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pb49"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.820050 4816 scope.go:117] "RemoveContainer" containerID="f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.821644 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wh2h7"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.825445 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wh2h7"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.832858 4816 scope.go:117] "RemoveContainer" containerID="625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.833165 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf\": container with ID starting with 625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf not found: ID does not exist" containerID="625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.833195 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf"} err="failed to get container status \"625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf\": rpc error: code = NotFound desc = could not find container \"625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf\": container with ID starting with 625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf not found: ID does not exist" Mar 16 
00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.833216 4816 scope.go:117] "RemoveContainer" containerID="908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.833541 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc\": container with ID starting with 908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc not found: ID does not exist" containerID="908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.833576 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc"} err="failed to get container status \"908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc\": rpc error: code = NotFound desc = could not find container \"908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc\": container with ID starting with 908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.833589 4816 scope.go:117] "RemoveContainer" containerID="f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.833933 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f\": container with ID starting with f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f not found: ID does not exist" containerID="f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.833983 4816 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f"} err="failed to get container status \"f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f\": rpc error: code = NotFound desc = could not find container \"f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f\": container with ID starting with f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.834018 4816 scope.go:117] "RemoveContainer" containerID="1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.847205 4816 scope.go:117] "RemoveContainer" containerID="67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.859988 4816 scope.go:117] "RemoveContainer" containerID="16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.874106 4816 scope.go:117] "RemoveContainer" containerID="1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.874492 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc\": container with ID starting with 1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc not found: ID does not exist" containerID="1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.874521 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc"} err="failed to get container status \"1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc\": rpc error: 
code = NotFound desc = could not find container \"1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc\": container with ID starting with 1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.874563 4816 scope.go:117] "RemoveContainer" containerID="67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.875047 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317\": container with ID starting with 67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317 not found: ID does not exist" containerID="67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.875067 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317"} err="failed to get container status \"67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317\": rpc error: code = NotFound desc = could not find container \"67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317\": container with ID starting with 67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317 not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.875079 4816 scope.go:117] "RemoveContainer" containerID="16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.875274 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89\": container with ID starting with 
16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89 not found: ID does not exist" containerID="16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.875294 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89"} err="failed to get container status \"16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89\": rpc error: code = NotFound desc = could not find container \"16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89\": container with ID starting with 16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89 not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.875307 4816 scope.go:117] "RemoveContainer" containerID="6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.893291 4816 scope.go:117] "RemoveContainer" containerID="6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.894781 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e\": container with ID starting with 6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e not found: ID does not exist" containerID="6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.894809 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e"} err="failed to get container status \"6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e\": rpc error: code = NotFound desc = could not find container 
\"6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e\": container with ID starting with 6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.894855 4816 scope.go:117] "RemoveContainer" containerID="0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.909308 4816 scope.go:117] "RemoveContainer" containerID="c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.925903 4816 scope.go:117] "RemoveContainer" containerID="c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.939929 4816 scope.go:117] "RemoveContainer" containerID="0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.940421 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d\": container with ID starting with 0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d not found: ID does not exist" containerID="0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.940451 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d"} err="failed to get container status \"0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d\": rpc error: code = NotFound desc = could not find container \"0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d\": container with ID starting with 0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d not found: ID does not exist" Mar 16 00:15:53 crc 
kubenswrapper[4816]: I0316 00:15:53.940471 4816 scope.go:117] "RemoveContainer" containerID="c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.940947 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1\": container with ID starting with c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1 not found: ID does not exist" containerID="c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.940970 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1"} err="failed to get container status \"c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1\": rpc error: code = NotFound desc = could not find container \"c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1\": container with ID starting with c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1 not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.940986 4816 scope.go:117] "RemoveContainer" containerID="c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.941189 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638\": container with ID starting with c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638 not found: ID does not exist" containerID="c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.941210 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638"} err="failed to get container status \"c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638\": rpc error: code = NotFound desc = could not find container \"c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638\": container with ID starting with c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638 not found: ID does not exist" Mar 16 00:15:54 crc kubenswrapper[4816]: I0316 00:15:54.698099 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.545859 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nvgvc"] Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546105 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546120 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546136 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerName="extract-content" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546144 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerName="extract-content" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546156 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerName="extract-content" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546164 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" 
containerName="extract-content" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546175 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546183 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546196 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerName="extract-utilities" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546203 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerName="extract-utilities" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546210 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerName="extract-utilities" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546218 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerName="extract-utilities" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546229 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerName="extract-content" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546236 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerName="extract-content" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546246 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerName="extract-utilities" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546255 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" 
containerName="extract-utilities" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546263 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546272 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546283 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerName="extract-content" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546290 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerName="extract-content" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546303 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546310 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546318 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerName="extract-utilities" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546328 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerName="extract-utilities" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546339 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02854230-6165-4f22-8780-d8591b991132" containerName="marketplace-operator" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546347 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="02854230-6165-4f22-8780-d8591b991132" 
containerName="marketplace-operator" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546456 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="02854230-6165-4f22-8780-d8591b991132" containerName="marketplace-operator" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546467 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546477 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546491 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546498 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.547318 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.552294 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.558920 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvgvc"] Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.643972 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh6xk\" (UniqueName: \"kubernetes.io/projected/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-kube-api-access-qh6xk\") pod \"redhat-marketplace-nvgvc\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.644021 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-utilities\") pod \"redhat-marketplace-nvgvc\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.644234 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-catalog-content\") pod \"redhat-marketplace-nvgvc\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.674090 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02854230-6165-4f22-8780-d8591b991132" path="/var/lib/kubelet/pods/02854230-6165-4f22-8780-d8591b991132/volumes" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.674653 
4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" path="/var/lib/kubelet/pods/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d/volumes" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.675194 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" path="/var/lib/kubelet/pods/a5ba22dd-8e8e-4beb-a540-e5c9687810b8/volumes" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.676182 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" path="/var/lib/kubelet/pods/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c/volumes" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.676755 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" path="/var/lib/kubelet/pods/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd/volumes" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.745092 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-catalog-content\") pod \"redhat-marketplace-nvgvc\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.745203 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh6xk\" (UniqueName: \"kubernetes.io/projected/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-kube-api-access-qh6xk\") pod \"redhat-marketplace-nvgvc\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.745223 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-utilities\") pod 
\"redhat-marketplace-nvgvc\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.745728 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-utilities\") pod \"redhat-marketplace-nvgvc\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.746208 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-catalog-content\") pod \"redhat-marketplace-nvgvc\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.769958 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh6xk\" (UniqueName: \"kubernetes.io/projected/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-kube-api-access-qh6xk\") pod \"redhat-marketplace-nvgvc\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.864933 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.147335 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hmzx7"] Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.148892 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.151967 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.163797 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hmzx7"] Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.250882 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2xct\" (UniqueName: \"kubernetes.io/projected/6df1dc3a-6abd-4ffc-b27b-e66f281ed273-kube-api-access-w2xct\") pod \"redhat-operators-hmzx7\" (UID: \"6df1dc3a-6abd-4ffc-b27b-e66f281ed273\") " pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.250945 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df1dc3a-6abd-4ffc-b27b-e66f281ed273-utilities\") pod \"redhat-operators-hmzx7\" (UID: \"6df1dc3a-6abd-4ffc-b27b-e66f281ed273\") " pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.250998 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df1dc3a-6abd-4ffc-b27b-e66f281ed273-catalog-content\") pod \"redhat-operators-hmzx7\" (UID: \"6df1dc3a-6abd-4ffc-b27b-e66f281ed273\") " pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.254597 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvgvc"] Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.352284 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df1dc3a-6abd-4ffc-b27b-e66f281ed273-catalog-content\") pod \"redhat-operators-hmzx7\" (UID: \"6df1dc3a-6abd-4ffc-b27b-e66f281ed273\") " pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.352409 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2xct\" (UniqueName: \"kubernetes.io/projected/6df1dc3a-6abd-4ffc-b27b-e66f281ed273-kube-api-access-w2xct\") pod \"redhat-operators-hmzx7\" (UID: \"6df1dc3a-6abd-4ffc-b27b-e66f281ed273\") " pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.352454 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df1dc3a-6abd-4ffc-b27b-e66f281ed273-utilities\") pod \"redhat-operators-hmzx7\" (UID: \"6df1dc3a-6abd-4ffc-b27b-e66f281ed273\") " pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.352878 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df1dc3a-6abd-4ffc-b27b-e66f281ed273-catalog-content\") pod \"redhat-operators-hmzx7\" (UID: \"6df1dc3a-6abd-4ffc-b27b-e66f281ed273\") " pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.353012 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df1dc3a-6abd-4ffc-b27b-e66f281ed273-utilities\") pod \"redhat-operators-hmzx7\" (UID: \"6df1dc3a-6abd-4ffc-b27b-e66f281ed273\") " pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.372332 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2xct\" (UniqueName: 
\"kubernetes.io/projected/6df1dc3a-6abd-4ffc-b27b-e66f281ed273-kube-api-access-w2xct\") pod \"redhat-operators-hmzx7\" (UID: \"6df1dc3a-6abd-4ffc-b27b-e66f281ed273\") " pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.480667 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.707241 4816 generic.go:334] "Generic (PLEG): container finished" podID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerID="4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096" exitCode=0 Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.707348 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvgvc" event={"ID":"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa","Type":"ContainerDied","Data":"4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096"} Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.707402 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvgvc" event={"ID":"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa","Type":"ContainerStarted","Data":"3956bdc0939ca6c80a18b82143c55a4cfebb9af362a0d61193b0fe36b4f051bd"} Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.710624 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.892605 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hmzx7"] Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.714460 4816 generic.go:334] "Generic (PLEG): container finished" podID="6df1dc3a-6abd-4ffc-b27b-e66f281ed273" containerID="6514aa2b53586e6671a19991e43ae80c50682b23666589c71c32e64209a97e8f" exitCode=0 Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.714810 4816 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmzx7" event={"ID":"6df1dc3a-6abd-4ffc-b27b-e66f281ed273","Type":"ContainerDied","Data":"6514aa2b53586e6671a19991e43ae80c50682b23666589c71c32e64209a97e8f"} Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.714836 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmzx7" event={"ID":"6df1dc3a-6abd-4ffc-b27b-e66f281ed273","Type":"ContainerStarted","Data":"92f2128bfb20f3e54453e51680d3555a1725778a5428366b87af7ee5ed62f8a2"} Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.951467 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8jcgw"] Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.953035 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.959361 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.963121 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8jcgw"] Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.992101 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/249ae30f-a698-43f3-9464-24868dff2ad6-utilities\") pod \"community-operators-8jcgw\" (UID: \"249ae30f-a698-43f3-9464-24868dff2ad6\") " pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.992159 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/249ae30f-a698-43f3-9464-24868dff2ad6-catalog-content\") pod \"community-operators-8jcgw\" (UID: 
\"249ae30f-a698-43f3-9464-24868dff2ad6\") " pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.992391 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd6xx\" (UniqueName: \"kubernetes.io/projected/249ae30f-a698-43f3-9464-24868dff2ad6-kube-api-access-zd6xx\") pod \"community-operators-8jcgw\" (UID: \"249ae30f-a698-43f3-9464-24868dff2ad6\") " pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.093532 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd6xx\" (UniqueName: \"kubernetes.io/projected/249ae30f-a698-43f3-9464-24868dff2ad6-kube-api-access-zd6xx\") pod \"community-operators-8jcgw\" (UID: \"249ae30f-a698-43f3-9464-24868dff2ad6\") " pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.093613 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/249ae30f-a698-43f3-9464-24868dff2ad6-utilities\") pod \"community-operators-8jcgw\" (UID: \"249ae30f-a698-43f3-9464-24868dff2ad6\") " pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.093637 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/249ae30f-a698-43f3-9464-24868dff2ad6-catalog-content\") pod \"community-operators-8jcgw\" (UID: \"249ae30f-a698-43f3-9464-24868dff2ad6\") " pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.094085 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/249ae30f-a698-43f3-9464-24868dff2ad6-catalog-content\") pod 
\"community-operators-8jcgw\" (UID: \"249ae30f-a698-43f3-9464-24868dff2ad6\") " pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.094489 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/249ae30f-a698-43f3-9464-24868dff2ad6-utilities\") pod \"community-operators-8jcgw\" (UID: \"249ae30f-a698-43f3-9464-24868dff2ad6\") " pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.117955 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd6xx\" (UniqueName: \"kubernetes.io/projected/249ae30f-a698-43f3-9464-24868dff2ad6-kube-api-access-zd6xx\") pod \"community-operators-8jcgw\" (UID: \"249ae30f-a698-43f3-9464-24868dff2ad6\") " pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.287172 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.547407 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6z2gx"] Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.548886 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.556978 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.563352 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6z2gx"] Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.600973 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pfdf\" (UniqueName: \"kubernetes.io/projected/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-kube-api-access-8pfdf\") pod \"certified-operators-6z2gx\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.601133 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-utilities\") pod \"certified-operators-6z2gx\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.601213 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-catalog-content\") pod \"certified-operators-6z2gx\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.700052 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8jcgw"] Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.702300 4816 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-utilities\") pod \"certified-operators-6z2gx\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.702398 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-catalog-content\") pod \"certified-operators-6z2gx\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.702449 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pfdf\" (UniqueName: \"kubernetes.io/projected/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-kube-api-access-8pfdf\") pod \"certified-operators-6z2gx\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.702923 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-utilities\") pod \"certified-operators-6z2gx\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.703018 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-catalog-content\") pod \"certified-operators-6z2gx\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: W0316 00:15:58.707649 4816 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod249ae30f_a698_43f3_9464_24868dff2ad6.slice/crio-77c4301a4c4eb164930f0ce4e68ef45c27dbf7e455ddd8186c4d632ca28ba588 WatchSource:0}: Error finding container 77c4301a4c4eb164930f0ce4e68ef45c27dbf7e455ddd8186c4d632ca28ba588: Status 404 returned error can't find the container with id 77c4301a4c4eb164930f0ce4e68ef45c27dbf7e455ddd8186c4d632ca28ba588 Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.722752 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jcgw" event={"ID":"249ae30f-a698-43f3-9464-24868dff2ad6","Type":"ContainerStarted","Data":"77c4301a4c4eb164930f0ce4e68ef45c27dbf7e455ddd8186c4d632ca28ba588"} Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.723388 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pfdf\" (UniqueName: \"kubernetes.io/projected/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-kube-api-access-8pfdf\") pod \"certified-operators-6z2gx\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.724906 4816 generic.go:334] "Generic (PLEG): container finished" podID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerID="6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64" exitCode=0 Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.724939 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvgvc" event={"ID":"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa","Type":"ContainerDied","Data":"6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64"} Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.876072 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:59 crc kubenswrapper[4816]: I0316 00:15:59.257034 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6z2gx"] Mar 16 00:15:59 crc kubenswrapper[4816]: W0316 00:15:59.259481 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d1b1f79_de52_4ade_9a72_69b86c55e8ff.slice/crio-d6634923f047727775df02d4d821820bfe16c08bbd5f740d3677d67d9b993223 WatchSource:0}: Error finding container d6634923f047727775df02d4d821820bfe16c08bbd5f740d3677d67d9b993223: Status 404 returned error can't find the container with id d6634923f047727775df02d4d821820bfe16c08bbd5f740d3677d67d9b993223 Mar 16 00:15:59 crc kubenswrapper[4816]: I0316 00:15:59.738340 4816 generic.go:334] "Generic (PLEG): container finished" podID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerID="5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005" exitCode=0 Mar 16 00:15:59 crc kubenswrapper[4816]: I0316 00:15:59.738401 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z2gx" event={"ID":"9d1b1f79-de52-4ade-9a72-69b86c55e8ff","Type":"ContainerDied","Data":"5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005"} Mar 16 00:15:59 crc kubenswrapper[4816]: I0316 00:15:59.738440 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z2gx" event={"ID":"9d1b1f79-de52-4ade-9a72-69b86c55e8ff","Type":"ContainerStarted","Data":"d6634923f047727775df02d4d821820bfe16c08bbd5f740d3677d67d9b993223"} Mar 16 00:15:59 crc kubenswrapper[4816]: I0316 00:15:59.742283 4816 generic.go:334] "Generic (PLEG): container finished" podID="249ae30f-a698-43f3-9464-24868dff2ad6" containerID="04a1a6b256be09e48052d9b9924ffda3f589a8015e9d1bf56580d8d275ea83bd" exitCode=0 Mar 16 00:15:59 crc kubenswrapper[4816]: I0316 
00:15:59.742325 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jcgw" event={"ID":"249ae30f-a698-43f3-9464-24868dff2ad6","Type":"ContainerDied","Data":"04a1a6b256be09e48052d9b9924ffda3f589a8015e9d1bf56580d8d275ea83bd"}
Mar 16 00:15:59 crc kubenswrapper[4816]: I0316 00:15:59.744713 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvgvc" event={"ID":"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa","Type":"ContainerStarted","Data":"5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a"}
Mar 16 00:15:59 crc kubenswrapper[4816]: I0316 00:15:59.748048 4816 generic.go:334] "Generic (PLEG): container finished" podID="6df1dc3a-6abd-4ffc-b27b-e66f281ed273" containerID="c323718467b7a05f9e466cb8c30f184578db98b430cc00f4184970fe4c9c9980" exitCode=0
Mar 16 00:15:59 crc kubenswrapper[4816]: I0316 00:15:59.748070 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmzx7" event={"ID":"6df1dc3a-6abd-4ffc-b27b-e66f281ed273","Type":"ContainerDied","Data":"c323718467b7a05f9e466cb8c30f184578db98b430cc00f4184970fe4c9c9980"}
Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.137849 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nvgvc" podStartSLOduration=2.5773334180000003 podStartE2EDuration="5.137829848s" podCreationTimestamp="2026-03-16 00:15:55 +0000 UTC" firstStartedPulling="2026-03-16 00:15:56.708665323 +0000 UTC m=+549.804965286" lastFinishedPulling="2026-03-16 00:15:59.269161763 +0000 UTC m=+552.365461716" observedRunningTime="2026-03-16 00:15:59.813235378 +0000 UTC m=+552.909535341" watchObservedRunningTime="2026-03-16 00:16:00.137829848 +0000 UTC m=+553.234129811"
Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.141714 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560336-fncq8"]
Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.142510 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560336-fncq8"
Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.144671 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.144985 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.145857 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r"
Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.157501 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560336-fncq8"]
Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.221730 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcq2v\" (UniqueName: \"kubernetes.io/projected/b478a542-14c7-4cca-9f95-64766b34df27-kube-api-access-lcq2v\") pod \"auto-csr-approver-29560336-fncq8\" (UID: \"b478a542-14c7-4cca-9f95-64766b34df27\") " pod="openshift-infra/auto-csr-approver-29560336-fncq8"
Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.322592 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcq2v\" (UniqueName: \"kubernetes.io/projected/b478a542-14c7-4cca-9f95-64766b34df27-kube-api-access-lcq2v\") pod \"auto-csr-approver-29560336-fncq8\" (UID: \"b478a542-14c7-4cca-9f95-64766b34df27\") " pod="openshift-infra/auto-csr-approver-29560336-fncq8"
Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.346126 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcq2v\" (UniqueName: \"kubernetes.io/projected/b478a542-14c7-4cca-9f95-64766b34df27-kube-api-access-lcq2v\") pod \"auto-csr-approver-29560336-fncq8\" (UID: \"b478a542-14c7-4cca-9f95-64766b34df27\") " pod="openshift-infra/auto-csr-approver-29560336-fncq8"
Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.457929 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560336-fncq8"
Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.661013 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560336-fncq8"]
Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.754466 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560336-fncq8" event={"ID":"b478a542-14c7-4cca-9f95-64766b34df27","Type":"ContainerStarted","Data":"f07774f669536f75976daaac2514712b31c1e4c589e53aab8a4efeae7ad978ba"}
Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.756269 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmzx7" event={"ID":"6df1dc3a-6abd-4ffc-b27b-e66f281ed273","Type":"ContainerStarted","Data":"23f016db7cf617088f08092141274ee3b1304f41b7b54130bb94c977188f621d"}
Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.779565 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hmzx7" podStartSLOduration=2.112517182 podStartE2EDuration="4.779512069s" podCreationTimestamp="2026-03-16 00:15:56 +0000 UTC" firstStartedPulling="2026-03-16 00:15:57.72125218 +0000 UTC m=+550.817552133" lastFinishedPulling="2026-03-16 00:16:00.388247067 +0000 UTC m=+553.484547020" observedRunningTime="2026-03-16 00:16:00.774463121 +0000 UTC m=+553.870763094" watchObservedRunningTime="2026-03-16 00:16:00.779512069 +0000 UTC m=+553.875812022"
Mar 16 00:16:01 crc kubenswrapper[4816]: I0316 00:16:01.764051 4816 generic.go:334] "Generic (PLEG): container finished" podID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerID="083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce" exitCode=0
Mar 16 00:16:01 crc kubenswrapper[4816]: I0316 00:16:01.764177 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z2gx" event={"ID":"9d1b1f79-de52-4ade-9a72-69b86c55e8ff","Type":"ContainerDied","Data":"083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce"}
Mar 16 00:16:01 crc kubenswrapper[4816]: I0316 00:16:01.775277 4816 generic.go:334] "Generic (PLEG): container finished" podID="249ae30f-a698-43f3-9464-24868dff2ad6" containerID="57ff0505ecaf816a1504b57a15468a94a9a38d0dae1d76628212d7c0e0c8e261" exitCode=0
Mar 16 00:16:01 crc kubenswrapper[4816]: I0316 00:16:01.775331 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jcgw" event={"ID":"249ae30f-a698-43f3-9464-24868dff2ad6","Type":"ContainerDied","Data":"57ff0505ecaf816a1504b57a15468a94a9a38d0dae1d76628212d7c0e0c8e261"}
Mar 16 00:16:02 crc kubenswrapper[4816]: I0316 00:16:02.783286 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z2gx" event={"ID":"9d1b1f79-de52-4ade-9a72-69b86c55e8ff","Type":"ContainerStarted","Data":"700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15"}
Mar 16 00:16:02 crc kubenswrapper[4816]: I0316 00:16:02.786852 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jcgw" event={"ID":"249ae30f-a698-43f3-9464-24868dff2ad6","Type":"ContainerStarted","Data":"29a2a6234324111f47ddb9745f9b567e930dbd3b8bac5490a80925c5a3ec4d8d"}
Mar 16 00:16:02 crc kubenswrapper[4816]: I0316 00:16:02.788885 4816 generic.go:334] "Generic (PLEG): container finished" podID="b478a542-14c7-4cca-9f95-64766b34df27" containerID="185e1a33c845773d7893f16759f110b3a4a2b357c62cdafa5e5060cabc62a64e" exitCode=0
Mar 16 00:16:02 crc kubenswrapper[4816]: I0316 00:16:02.788930 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560336-fncq8" event={"ID":"b478a542-14c7-4cca-9f95-64766b34df27","Type":"ContainerDied","Data":"185e1a33c845773d7893f16759f110b3a4a2b357c62cdafa5e5060cabc62a64e"}
Mar 16 00:16:02 crc kubenswrapper[4816]: I0316 00:16:02.804639 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6z2gx" podStartSLOduration=2.105553273 podStartE2EDuration="4.804622079s" podCreationTimestamp="2026-03-16 00:15:58 +0000 UTC" firstStartedPulling="2026-03-16 00:15:59.740560491 +0000 UTC m=+552.836860444" lastFinishedPulling="2026-03-16 00:16:02.439629297 +0000 UTC m=+555.535929250" observedRunningTime="2026-03-16 00:16:02.801489017 +0000 UTC m=+555.897788970" watchObservedRunningTime="2026-03-16 00:16:02.804622079 +0000 UTC m=+555.900922032"
Mar 16 00:16:02 crc kubenswrapper[4816]: I0316 00:16:02.821354 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8jcgw" podStartSLOduration=3.01836504 podStartE2EDuration="5.821334138s" podCreationTimestamp="2026-03-16 00:15:57 +0000 UTC" firstStartedPulling="2026-03-16 00:15:59.752770298 +0000 UTC m=+552.849070251" lastFinishedPulling="2026-03-16 00:16:02.555739396 +0000 UTC m=+555.652039349" observedRunningTime="2026-03-16 00:16:02.820564626 +0000 UTC m=+555.916864589" watchObservedRunningTime="2026-03-16 00:16:02.821334138 +0000 UTC m=+555.917634101"
Mar 16 00:16:04 crc kubenswrapper[4816]: I0316 00:16:04.070567 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560336-fncq8"
Mar 16 00:16:04 crc kubenswrapper[4816]: I0316 00:16:04.178847 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcq2v\" (UniqueName: \"kubernetes.io/projected/b478a542-14c7-4cca-9f95-64766b34df27-kube-api-access-lcq2v\") pod \"b478a542-14c7-4cca-9f95-64766b34df27\" (UID: \"b478a542-14c7-4cca-9f95-64766b34df27\") "
Mar 16 00:16:04 crc kubenswrapper[4816]: I0316 00:16:04.184779 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b478a542-14c7-4cca-9f95-64766b34df27-kube-api-access-lcq2v" (OuterVolumeSpecName: "kube-api-access-lcq2v") pod "b478a542-14c7-4cca-9f95-64766b34df27" (UID: "b478a542-14c7-4cca-9f95-64766b34df27"). InnerVolumeSpecName "kube-api-access-lcq2v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:16:04 crc kubenswrapper[4816]: I0316 00:16:04.280749 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcq2v\" (UniqueName: \"kubernetes.io/projected/b478a542-14c7-4cca-9f95-64766b34df27-kube-api-access-lcq2v\") on node \"crc\" DevicePath \"\""
Mar 16 00:16:04 crc kubenswrapper[4816]: I0316 00:16:04.802424 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560336-fncq8" event={"ID":"b478a542-14c7-4cca-9f95-64766b34df27","Type":"ContainerDied","Data":"f07774f669536f75976daaac2514712b31c1e4c589e53aab8a4efeae7ad978ba"}
Mar 16 00:16:04 crc kubenswrapper[4816]: I0316 00:16:04.802469 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f07774f669536f75976daaac2514712b31c1e4c589e53aab8a4efeae7ad978ba"
Mar 16 00:16:04 crc kubenswrapper[4816]: I0316 00:16:04.802473 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560336-fncq8"
Mar 16 00:16:05 crc kubenswrapper[4816]: I0316 00:16:05.124212 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560330-44pts"]
Mar 16 00:16:05 crc kubenswrapper[4816]: I0316 00:16:05.127926 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560330-44pts"]
Mar 16 00:16:05 crc kubenswrapper[4816]: I0316 00:16:05.675606 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55e76e8f-7d69-4f55-81f8-45c9c612876b" path="/var/lib/kubelet/pods/55e76e8f-7d69-4f55-81f8-45c9c612876b/volumes"
Mar 16 00:16:05 crc kubenswrapper[4816]: I0316 00:16:05.865526 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nvgvc"
Mar 16 00:16:05 crc kubenswrapper[4816]: I0316 00:16:05.865581 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nvgvc"
Mar 16 00:16:05 crc kubenswrapper[4816]: I0316 00:16:05.908397 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nvgvc"
Mar 16 00:16:06 crc kubenswrapper[4816]: I0316 00:16:06.481868 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hmzx7"
Mar 16 00:16:06 crc kubenswrapper[4816]: I0316 00:16:06.482254 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hmzx7"
Mar 16 00:16:06 crc kubenswrapper[4816]: I0316 00:16:06.867279 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nvgvc"
Mar 16 00:16:07 crc kubenswrapper[4816]: I0316 00:16:07.523051 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hmzx7" podUID="6df1dc3a-6abd-4ffc-b27b-e66f281ed273" containerName="registry-server" probeResult="failure" output=<
Mar 16 00:16:07 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s
Mar 16 00:16:07 crc kubenswrapper[4816]: >
Mar 16 00:16:08 crc kubenswrapper[4816]: I0316 00:16:08.288177 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8jcgw"
Mar 16 00:16:08 crc kubenswrapper[4816]: I0316 00:16:08.288232 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8jcgw"
Mar 16 00:16:08 crc kubenswrapper[4816]: I0316 00:16:08.324220 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8jcgw"
Mar 16 00:16:08 crc kubenswrapper[4816]: I0316 00:16:08.876676 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6z2gx"
Mar 16 00:16:08 crc kubenswrapper[4816]: I0316 00:16:08.876751 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6z2gx"
Mar 16 00:16:08 crc kubenswrapper[4816]: I0316 00:16:08.899049 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8jcgw"
Mar 16 00:16:08 crc kubenswrapper[4816]: I0316 00:16:08.937193 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6z2gx"
Mar 16 00:16:09 crc kubenswrapper[4816]: I0316 00:16:09.904149 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6z2gx"
Mar 16 00:16:16 crc kubenswrapper[4816]: I0316 00:16:16.540361 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hmzx7"
Mar 16 00:16:16 crc kubenswrapper[4816]: I0316 00:16:16.582613 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hmzx7"
Mar 16 00:17:03 crc kubenswrapper[4816]: I0316 00:17:03.255048 4816 scope.go:117] "RemoveContainer" containerID="a0546877ac51e8fef907f2152b03530a1aaadfb1ec0bb2cad119c19beb5651ba"
Mar 16 00:17:03 crc kubenswrapper[4816]: I0316 00:17:03.302580 4816 scope.go:117] "RemoveContainer" containerID="5259cd97d29c896bcf8ba7141fe44641e990295b28288f54dfe4315de536ad23"
Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.139181 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560338-8bkf9"]
Mar 16 00:18:00 crc kubenswrapper[4816]: E0316 00:18:00.140026 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b478a542-14c7-4cca-9f95-64766b34df27" containerName="oc"
Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.140044 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b478a542-14c7-4cca-9f95-64766b34df27" containerName="oc"
Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.140196 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b478a542-14c7-4cca-9f95-64766b34df27" containerName="oc"
Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.141047 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560338-8bkf9"
Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.145977 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.145977 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r"
Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.146053 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.146475 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560338-8bkf9"]
Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.261056 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2gzk\" (UniqueName: \"kubernetes.io/projected/6cfda38e-dbdc-4b42-8a0d-964103ee01cd-kube-api-access-p2gzk\") pod \"auto-csr-approver-29560338-8bkf9\" (UID: \"6cfda38e-dbdc-4b42-8a0d-964103ee01cd\") " pod="openshift-infra/auto-csr-approver-29560338-8bkf9"
Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.361826 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2gzk\" (UniqueName: \"kubernetes.io/projected/6cfda38e-dbdc-4b42-8a0d-964103ee01cd-kube-api-access-p2gzk\") pod \"auto-csr-approver-29560338-8bkf9\" (UID: \"6cfda38e-dbdc-4b42-8a0d-964103ee01cd\") " pod="openshift-infra/auto-csr-approver-29560338-8bkf9"
Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.396093 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2gzk\" (UniqueName: \"kubernetes.io/projected/6cfda38e-dbdc-4b42-8a0d-964103ee01cd-kube-api-access-p2gzk\") pod \"auto-csr-approver-29560338-8bkf9\" (UID: \"6cfda38e-dbdc-4b42-8a0d-964103ee01cd\") " pod="openshift-infra/auto-csr-approver-29560338-8bkf9"
Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.512093 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560338-8bkf9"
Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.729331 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560338-8bkf9"]
Mar 16 00:18:01 crc kubenswrapper[4816]: I0316 00:18:01.557295 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560338-8bkf9" event={"ID":"6cfda38e-dbdc-4b42-8a0d-964103ee01cd","Type":"ContainerStarted","Data":"c3e79faa65c4ca4a97231baa5de42757fa1bf5ee8bd498027cd3d986320c200c"}
Mar 16 00:18:01 crc kubenswrapper[4816]: I0316 00:18:01.863614 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 16 00:18:01 crc kubenswrapper[4816]: I0316 00:18:01.863686 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 16 00:18:02 crc kubenswrapper[4816]: I0316 00:18:02.566385 4816 generic.go:334] "Generic (PLEG): container finished" podID="6cfda38e-dbdc-4b42-8a0d-964103ee01cd" containerID="b862cec0bd3d63e5c9dfe4071f9f4f3cb758b083bc3f73a5460bc03b5c4debd8" exitCode=0
Mar 16 00:18:02 crc kubenswrapper[4816]: I0316 00:18:02.566468 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560338-8bkf9" event={"ID":"6cfda38e-dbdc-4b42-8a0d-964103ee01cd","Type":"ContainerDied","Data":"b862cec0bd3d63e5c9dfe4071f9f4f3cb758b083bc3f73a5460bc03b5c4debd8"}
Mar 16 00:18:03 crc kubenswrapper[4816]: I0316 00:18:03.763532 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560338-8bkf9"
Mar 16 00:18:03 crc kubenswrapper[4816]: I0316 00:18:03.909660 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2gzk\" (UniqueName: \"kubernetes.io/projected/6cfda38e-dbdc-4b42-8a0d-964103ee01cd-kube-api-access-p2gzk\") pod \"6cfda38e-dbdc-4b42-8a0d-964103ee01cd\" (UID: \"6cfda38e-dbdc-4b42-8a0d-964103ee01cd\") "
Mar 16 00:18:03 crc kubenswrapper[4816]: I0316 00:18:03.916229 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cfda38e-dbdc-4b42-8a0d-964103ee01cd-kube-api-access-p2gzk" (OuterVolumeSpecName: "kube-api-access-p2gzk") pod "6cfda38e-dbdc-4b42-8a0d-964103ee01cd" (UID: "6cfda38e-dbdc-4b42-8a0d-964103ee01cd"). InnerVolumeSpecName "kube-api-access-p2gzk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:18:04 crc kubenswrapper[4816]: I0316 00:18:04.012191 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2gzk\" (UniqueName: \"kubernetes.io/projected/6cfda38e-dbdc-4b42-8a0d-964103ee01cd-kube-api-access-p2gzk\") on node \"crc\" DevicePath \"\""
Mar 16 00:18:04 crc kubenswrapper[4816]: I0316 00:18:04.578200 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560338-8bkf9" event={"ID":"6cfda38e-dbdc-4b42-8a0d-964103ee01cd","Type":"ContainerDied","Data":"c3e79faa65c4ca4a97231baa5de42757fa1bf5ee8bd498027cd3d986320c200c"}
Mar 16 00:18:04 crc kubenswrapper[4816]: I0316 00:18:04.578495 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3e79faa65c4ca4a97231baa5de42757fa1bf5ee8bd498027cd3d986320c200c"
Mar 16 00:18:04 crc kubenswrapper[4816]: I0316 00:18:04.578263 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560338-8bkf9"
Mar 16 00:18:04 crc kubenswrapper[4816]: I0316 00:18:04.818892 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560332-wb8kg"]
Mar 16 00:18:04 crc kubenswrapper[4816]: I0316 00:18:04.828991 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560332-wb8kg"]
Mar 16 00:18:05 crc kubenswrapper[4816]: I0316 00:18:05.676914 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e570fb38-3e4c-4b9b-82d9-878ec6a5306f" path="/var/lib/kubelet/pods/e570fb38-3e4c-4b9b-82d9-878ec6a5306f/volumes"
Mar 16 00:18:31 crc kubenswrapper[4816]: I0316 00:18:31.863750 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 16 00:18:31 crc kubenswrapper[4816]: I0316 00:18:31.864213 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 16 00:19:01 crc kubenswrapper[4816]: I0316 00:19:01.863667 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 16 00:19:01 crc kubenswrapper[4816]: I0316 00:19:01.864366 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 16 00:19:01 crc kubenswrapper[4816]: I0316 00:19:01.864431 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz"
Mar 16 00:19:01 crc kubenswrapper[4816]: I0316 00:19:01.865427 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"054dcd9294a0533063364a3ea7e009e513fea0236f1afad37201a02a85a0eee4"} pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 16 00:19:01 crc kubenswrapper[4816]: I0316 00:19:01.865515 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" containerID="cri-o://054dcd9294a0533063364a3ea7e009e513fea0236f1afad37201a02a85a0eee4" gracePeriod=600
Mar 16 00:19:02 crc kubenswrapper[4816]: I0316 00:19:02.947621 4816 generic.go:334] "Generic (PLEG): container finished" podID="dd08ece2-7636-4966-973a-e96a34b70b53" containerID="054dcd9294a0533063364a3ea7e009e513fea0236f1afad37201a02a85a0eee4" exitCode=0
Mar 16 00:19:02 crc kubenswrapper[4816]: I0316 00:19:02.947696 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerDied","Data":"054dcd9294a0533063364a3ea7e009e513fea0236f1afad37201a02a85a0eee4"}
Mar 16 00:19:02 crc kubenswrapper[4816]: I0316 00:19:02.947749 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"d940a23c182654ea98c304045d406af01d62b828901045324158f53e5e4988ad"}
Mar 16 00:19:02 crc kubenswrapper[4816]: I0316 00:19:02.947781 4816 scope.go:117] "RemoveContainer" containerID="8214b8a7550606e587b215ee7c72e3638e054dd083cb6fa7b37990d33bec509b"
Mar 16 00:19:03 crc kubenswrapper[4816]: I0316 00:19:03.361666 4816 scope.go:117] "RemoveContainer" containerID="9cbc70d2e0b275d40fbacb6be14712c60796f46bdd73e4f108a004a37c120cb9"
Mar 16 00:19:03 crc kubenswrapper[4816]: I0316 00:19:03.375362 4816 scope.go:117] "RemoveContainer" containerID="92ce11f74b2381302bcae2babd96b3eab76e1d28bfb034c70d8b99be8178dac1"
Mar 16 00:19:03 crc kubenswrapper[4816]: I0316 00:19:03.393839 4816 scope.go:117] "RemoveContainer" containerID="c422afc027f6d729cf317777cce7cb5de5ed92334512743c933f67e04e4724ef"
Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.143562 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560340-pmlmw"]
Mar 16 00:20:00 crc kubenswrapper[4816]: E0316 00:20:00.145944 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cfda38e-dbdc-4b42-8a0d-964103ee01cd" containerName="oc"
Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.146037 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cfda38e-dbdc-4b42-8a0d-964103ee01cd" containerName="oc"
Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.146224 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cfda38e-dbdc-4b42-8a0d-964103ee01cd" containerName="oc"
Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.146825 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560340-pmlmw"
Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.150487 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.150522 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r"
Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.152030 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.163685 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560340-pmlmw"]
Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.172644 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26lqg\" (UniqueName: \"kubernetes.io/projected/dc958138-2767-4d7a-8f61-bd16b899189f-kube-api-access-26lqg\") pod \"auto-csr-approver-29560340-pmlmw\" (UID: \"dc958138-2767-4d7a-8f61-bd16b899189f\") " pod="openshift-infra/auto-csr-approver-29560340-pmlmw"
Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.273462 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26lqg\" (UniqueName: \"kubernetes.io/projected/dc958138-2767-4d7a-8f61-bd16b899189f-kube-api-access-26lqg\") pod \"auto-csr-approver-29560340-pmlmw\" (UID: \"dc958138-2767-4d7a-8f61-bd16b899189f\") " pod="openshift-infra/auto-csr-approver-29560340-pmlmw"
Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.300744 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26lqg\" (UniqueName: \"kubernetes.io/projected/dc958138-2767-4d7a-8f61-bd16b899189f-kube-api-access-26lqg\") pod \"auto-csr-approver-29560340-pmlmw\" (UID: \"dc958138-2767-4d7a-8f61-bd16b899189f\") " pod="openshift-infra/auto-csr-approver-29560340-pmlmw"
Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.465857 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560340-pmlmw"
Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.685441 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560340-pmlmw"]
Mar 16 00:20:01 crc kubenswrapper[4816]: I0316 00:20:01.316926 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560340-pmlmw" event={"ID":"dc958138-2767-4d7a-8f61-bd16b899189f","Type":"ContainerStarted","Data":"f4329218a78829e970d0cd09947abdb25a1eb256ae427623608fcb446c86f8f3"}
Mar 16 00:20:02 crc kubenswrapper[4816]: I0316 00:20:02.323166 4816 generic.go:334] "Generic (PLEG): container finished" podID="dc958138-2767-4d7a-8f61-bd16b899189f" containerID="4565949d11f1fa384d67b3420395f0c07c9d2ee22190f1a94b2e1bc9e4c10a96" exitCode=0
Mar 16 00:20:02 crc kubenswrapper[4816]: I0316 00:20:02.323387 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560340-pmlmw" event={"ID":"dc958138-2767-4d7a-8f61-bd16b899189f","Type":"ContainerDied","Data":"4565949d11f1fa384d67b3420395f0c07c9d2ee22190f1a94b2e1bc9e4c10a96"}
Mar 16 00:20:03 crc kubenswrapper[4816]: I0316 00:20:03.547323 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560340-pmlmw"
Mar 16 00:20:03 crc kubenswrapper[4816]: I0316 00:20:03.731836 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26lqg\" (UniqueName: \"kubernetes.io/projected/dc958138-2767-4d7a-8f61-bd16b899189f-kube-api-access-26lqg\") pod \"dc958138-2767-4d7a-8f61-bd16b899189f\" (UID: \"dc958138-2767-4d7a-8f61-bd16b899189f\") "
Mar 16 00:20:03 crc kubenswrapper[4816]: I0316 00:20:03.739736 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc958138-2767-4d7a-8f61-bd16b899189f-kube-api-access-26lqg" (OuterVolumeSpecName: "kube-api-access-26lqg") pod "dc958138-2767-4d7a-8f61-bd16b899189f" (UID: "dc958138-2767-4d7a-8f61-bd16b899189f"). InnerVolumeSpecName "kube-api-access-26lqg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:20:03 crc kubenswrapper[4816]: I0316 00:20:03.833153 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26lqg\" (UniqueName: \"kubernetes.io/projected/dc958138-2767-4d7a-8f61-bd16b899189f-kube-api-access-26lqg\") on node \"crc\" DevicePath \"\""
Mar 16 00:20:04 crc kubenswrapper[4816]: I0316 00:20:04.337450 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560340-pmlmw" event={"ID":"dc958138-2767-4d7a-8f61-bd16b899189f","Type":"ContainerDied","Data":"f4329218a78829e970d0cd09947abdb25a1eb256ae427623608fcb446c86f8f3"}
Mar 16 00:20:04 crc kubenswrapper[4816]: I0316 00:20:04.337913 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4329218a78829e970d0cd09947abdb25a1eb256ae427623608fcb446c86f8f3"
Mar 16 00:20:04 crc kubenswrapper[4816]: I0316 00:20:04.337497 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560340-pmlmw"
Mar 16 00:20:04 crc kubenswrapper[4816]: I0316 00:20:04.607810 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560334-7sx8j"]
Mar 16 00:20:04 crc kubenswrapper[4816]: I0316 00:20:04.611201 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560334-7sx8j"]
Mar 16 00:20:05 crc kubenswrapper[4816]: I0316 00:20:05.679795 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5160d394-3d9b-4066-9bea-b9dd787b2a42" path="/var/lib/kubelet/pods/5160d394-3d9b-4066-9bea-b9dd787b2a42/volumes"
Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.784301 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lslqp"]
Mar 16 00:20:11 crc kubenswrapper[4816]: E0316 00:20:11.785808 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc958138-2767-4d7a-8f61-bd16b899189f" containerName="oc"
Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.785911 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc958138-2767-4d7a-8f61-bd16b899189f" containerName="oc"
Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.786109 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc958138-2767-4d7a-8f61-bd16b899189f" containerName="oc"
Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.786678 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lslqp"
Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.801523 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lslqp"]
Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.968811 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c7a56fa5-2504-4cbc-87c9-769b6c88b362-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp"
Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.968898 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c7a56fa5-2504-4cbc-87c9-769b6c88b362-bound-sa-token\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp"
Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.968922 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c7a56fa5-2504-4cbc-87c9-769b6c88b362-registry-certificates\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp"
Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.968943 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7a56fa5-2504-4cbc-87c9-769b6c88b362-trusted-ca\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp"
Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.968989 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp"
Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.969016 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c7a56fa5-2504-4cbc-87c9-769b6c88b362-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp"
Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.969041 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c7a56fa5-2504-4cbc-87c9-769b6c88b362-registry-tls\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp"
Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.969062 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx29s\" (UniqueName: \"kubernetes.io/projected/c7a56fa5-2504-4cbc-87c9-769b6c88b362-kube-api-access-jx29s\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp"
Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.992827 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp"
Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.070318 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c7a56fa5-2504-4cbc-87c9-769b6c88b362-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp"
Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.070793 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c7a56fa5-2504-4cbc-87c9-769b6c88b362-bound-sa-token\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp"
Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.070921 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c7a56fa5-2504-4cbc-87c9-769b6c88b362-registry-certificates\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.071050 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7a56fa5-2504-4cbc-87c9-769b6c88b362-trusted-ca\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.071209 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c7a56fa5-2504-4cbc-87c9-769b6c88b362-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.071327 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c7a56fa5-2504-4cbc-87c9-769b6c88b362-registry-tls\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.071435 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx29s\" (UniqueName: \"kubernetes.io/projected/c7a56fa5-2504-4cbc-87c9-769b6c88b362-kube-api-access-jx29s\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.071793 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c7a56fa5-2504-4cbc-87c9-769b6c88b362-ca-trust-extracted\") pod 
\"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.072601 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7a56fa5-2504-4cbc-87c9-769b6c88b362-trusted-ca\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.072675 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c7a56fa5-2504-4cbc-87c9-769b6c88b362-registry-certificates\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.075893 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c7a56fa5-2504-4cbc-87c9-769b6c88b362-registry-tls\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.078878 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c7a56fa5-2504-4cbc-87c9-769b6c88b362-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.091732 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx29s\" (UniqueName: 
\"kubernetes.io/projected/c7a56fa5-2504-4cbc-87c9-769b6c88b362-kube-api-access-jx29s\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.093148 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c7a56fa5-2504-4cbc-87c9-769b6c88b362-bound-sa-token\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.102226 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.335338 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lslqp"] Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.392103 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" event={"ID":"c7a56fa5-2504-4cbc-87c9-769b6c88b362","Type":"ContainerStarted","Data":"1b866e033d39f36c3f3be137aec614f6b7183066771a995c5186d2ace40ccf4a"} Mar 16 00:20:13 crc kubenswrapper[4816]: I0316 00:20:13.401148 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" event={"ID":"c7a56fa5-2504-4cbc-87c9-769b6c88b362","Type":"ContainerStarted","Data":"e0fdd0c14a8a6704265bba7a35d3d797c6de6599bb0a9e1fee5998e2a4d29135"} Mar 16 00:20:13 crc kubenswrapper[4816]: I0316 00:20:13.401439 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:13 crc kubenswrapper[4816]: I0316 00:20:13.431233 4816 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" podStartSLOduration=2.431208724 podStartE2EDuration="2.431208724s" podCreationTimestamp="2026-03-16 00:20:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:20:13.426302361 +0000 UTC m=+806.522602314" watchObservedRunningTime="2026-03-16 00:20:13.431208724 +0000 UTC m=+806.527508697" Mar 16 00:20:32 crc kubenswrapper[4816]: I0316 00:20:32.111498 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:32 crc kubenswrapper[4816]: I0316 00:20:32.165702 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ckvwn"] Mar 16 00:20:48 crc kubenswrapper[4816]: I0316 00:20:48.426661 4816 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.212311 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" podUID="b155133b-d494-44bc-aa5d-23efc7cbd7a6" containerName="registry" containerID="cri-o://4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b" gracePeriod=30 Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.586589 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.678646 4816 generic.go:334] "Generic (PLEG): container finished" podID="b155133b-d494-44bc-aa5d-23efc7cbd7a6" containerID="4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b" exitCode=0 Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.678697 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" event={"ID":"b155133b-d494-44bc-aa5d-23efc7cbd7a6","Type":"ContainerDied","Data":"4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b"} Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.678724 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" event={"ID":"b155133b-d494-44bc-aa5d-23efc7cbd7a6","Type":"ContainerDied","Data":"e368502f9ca177437add127848813e2ad33e96c185b8ab726042b2878dcec995"} Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.678745 4816 scope.go:117] "RemoveContainer" containerID="4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.678808 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.697032 4816 scope.go:117] "RemoveContainer" containerID="4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b" Mar 16 00:20:57 crc kubenswrapper[4816]: E0316 00:20:57.697815 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b\": container with ID starting with 4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b not found: ID does not exist" containerID="4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.697892 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b"} err="failed to get container status \"4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b\": rpc error: code = NotFound desc = could not find container \"4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b\": container with ID starting with 4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b not found: ID does not exist" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.739125 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r7jk\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-kube-api-access-9r7jk\") pod \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.739199 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b155133b-d494-44bc-aa5d-23efc7cbd7a6-installation-pull-secrets\") pod 
\"b155133b-d494-44bc-aa5d-23efc7cbd7a6\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.739230 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b155133b-d494-44bc-aa5d-23efc7cbd7a6-ca-trust-extracted\") pod \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.739285 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-certificates\") pod \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.739321 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-tls\") pod \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.739360 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-bound-sa-token\") pod \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.739404 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-trusted-ca\") pod \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.739615 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.742205 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b155133b-d494-44bc-aa5d-23efc7cbd7a6" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.742758 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b155133b-d494-44bc-aa5d-23efc7cbd7a6" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.747486 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b155133b-d494-44bc-aa5d-23efc7cbd7a6" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.751272 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-kube-api-access-9r7jk" (OuterVolumeSpecName: "kube-api-access-9r7jk") pod "b155133b-d494-44bc-aa5d-23efc7cbd7a6" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6"). InnerVolumeSpecName "kube-api-access-9r7jk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.751763 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b155133b-d494-44bc-aa5d-23efc7cbd7a6-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b155133b-d494-44bc-aa5d-23efc7cbd7a6" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.751997 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b155133b-d494-44bc-aa5d-23efc7cbd7a6" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.757482 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b155133b-d494-44bc-aa5d-23efc7cbd7a6-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b155133b-d494-44bc-aa5d-23efc7cbd7a6" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.761123 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b155133b-d494-44bc-aa5d-23efc7cbd7a6" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.841140 4816 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.841197 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.841210 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r7jk\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-kube-api-access-9r7jk\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.841229 4816 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b155133b-d494-44bc-aa5d-23efc7cbd7a6-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.841250 4816 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b155133b-d494-44bc-aa5d-23efc7cbd7a6-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.841336 4816 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.841352 4816 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:58 crc 
kubenswrapper[4816]: I0316 00:20:58.030206 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ckvwn"] Mar 16 00:20:58 crc kubenswrapper[4816]: I0316 00:20:58.042764 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ckvwn"] Mar 16 00:20:59 crc kubenswrapper[4816]: I0316 00:20:59.677347 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b155133b-d494-44bc-aa5d-23efc7cbd7a6" path="/var/lib/kubelet/pods/b155133b-d494-44bc-aa5d-23efc7cbd7a6/volumes" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.399487 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-psjs7"] Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.400443 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovn-controller" containerID="cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9" gracePeriod=30 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.400540 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="nbdb" containerID="cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6" gracePeriod=30 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.400613 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovn-acl-logging" containerID="cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2" gracePeriod=30 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.400598 4816 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed" gracePeriod=30 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.400782 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="northd" containerID="cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf" gracePeriod=30 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.400816 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="sbdb" containerID="cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56" gracePeriod=30 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.400589 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kube-rbac-proxy-node" containerID="cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641" gracePeriod=30 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.449574 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" containerID="cri-o://7e5d87dc1889484bb2175c0613eda0b852c65a289a1c165f6adae2a822892aa2" gracePeriod=30 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.707775 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-szscw_e9789e58-12c8-4831-9401-af48a3e92209/kube-multus/2.log" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.708303 4816 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-multus_multus-szscw_e9789e58-12c8-4831-9401-af48a3e92209/kube-multus/1.log" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.708361 4816 generic.go:334] "Generic (PLEG): container finished" podID="e9789e58-12c8-4831-9401-af48a3e92209" containerID="707ec2df051aa6206ac2bc1c4db6b5fe6b37467b90b6ee42dbf28f2b88e5d6e6" exitCode=2 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.708465 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-szscw" event={"ID":"e9789e58-12c8-4831-9401-af48a3e92209","Type":"ContainerDied","Data":"707ec2df051aa6206ac2bc1c4db6b5fe6b37467b90b6ee42dbf28f2b88e5d6e6"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.708508 4816 scope.go:117] "RemoveContainer" containerID="b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.709152 4816 scope.go:117] "RemoveContainer" containerID="707ec2df051aa6206ac2bc1c4db6b5fe6b37467b90b6ee42dbf28f2b88e5d6e6" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.713214 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/3.log" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.716289 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovn-acl-logging/0.log" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.717141 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovn-controller/0.log" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718415 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="7e5d87dc1889484bb2175c0613eda0b852c65a289a1c165f6adae2a822892aa2" 
exitCode=0 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718455 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56" exitCode=0 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718463 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6" exitCode=0 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718470 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf" exitCode=0 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718477 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed" exitCode=0 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718483 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641" exitCode=0 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718489 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2" exitCode=143 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718496 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9" exitCode=143 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718506 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" 
event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"7e5d87dc1889484bb2175c0613eda0b852c65a289a1c165f6adae2a822892aa2"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718581 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718600 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718616 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718631 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718645 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718660 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" 
event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718675 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718688 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"5a5bb9275c0f8cb8f5457ed5c2f6ecf42790ebe9298a9783a1a55a1c78e14761"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718702 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a5bb9275c0f8cb8f5457ed5c2f6ecf42790ebe9298a9783a1a55a1c78e14761" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.740232 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/3.log" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.744411 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovn-acl-logging/0.log" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.745070 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovn-controller/0.log" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.745701 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.746434 4816 scope.go:117] "RemoveContainer" containerID="ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806406 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8h7qh"] Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806685 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806710 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806723 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovn-acl-logging" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806733 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovn-acl-logging" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806749 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kube-rbac-proxy-node" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806758 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kube-rbac-proxy-node" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806769 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="sbdb" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806777 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" 
containerName="sbdb" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806788 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="northd" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806795 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="northd" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806808 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806817 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806828 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovn-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806837 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovn-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806852 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b155133b-d494-44bc-aa5d-23efc7cbd7a6" containerName="registry" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806861 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b155133b-d494-44bc-aa5d-23efc7cbd7a6" containerName="registry" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806873 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="nbdb" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806881 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="nbdb" Mar 16 00:21:00 crc kubenswrapper[4816]: 
E0316 00:21:00.806893 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kubecfg-setup" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806903 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kubecfg-setup" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806914 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806922 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806932 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kube-rbac-proxy-ovn-metrics" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806940 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kube-rbac-proxy-ovn-metrics" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806955 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806964 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807085 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="northd" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807100 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kube-rbac-proxy-node" Mar 16 00:21:00 
crc kubenswrapper[4816]: I0316 00:21:00.807110 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807119 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kube-rbac-proxy-ovn-metrics" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807129 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807138 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="sbdb" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807150 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807162 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="nbdb" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807169 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovn-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807182 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovn-acl-logging" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807194 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b155133b-d494-44bc-aa5d-23efc7cbd7a6" containerName="registry" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.807323 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" 
containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807334 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807444 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807457 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.809537 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884031 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884098 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-config\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884133 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-bin\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 
00:21:00.884163 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rd68\" (UniqueName: \"kubernetes.io/projected/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-kube-api-access-9rd68\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884187 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-kubelet\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884207 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-script-lib\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884250 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-systemd-units\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884270 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-ovn-kubernetes\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884290 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-log-socket\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884312 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-ovn\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884336 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-netd\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884366 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovn-node-metrics-cert\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884393 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-openvswitch\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884418 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-env-overrides\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884440 
4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-node-log\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884484 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-slash\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884507 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-systemd\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884573 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-var-lib-openvswitch\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884596 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-netns\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884617 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-etc-openvswitch\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" 
(UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.885248 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.885753 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.885793 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.887971 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888667 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888682 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888723 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888730 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-slash" (OuterVolumeSpecName: "host-slash") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888853 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888874 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888897 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-log-socket" (OuterVolumeSpecName: "log-socket") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888878 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-node-log" (OuterVolumeSpecName: "node-log") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888918 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888949 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.889028 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.889182 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.889279 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.891928 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-kube-api-access-9rd68" (OuterVolumeSpecName: "kube-api-access-9rd68") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "kube-api-access-9rd68". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.894215 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.911425 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.985903 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-systemd-units\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986049 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/842bb112-f402-4717-bc56-f488fc3c5db7-env-overrides\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986134 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-var-lib-openvswitch\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986185 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-log-socket\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986219 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/842bb112-f402-4717-bc56-f488fc3c5db7-ovnkube-script-lib\") pod \"ovnkube-node-8h7qh\" (UID: 
\"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986304 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/842bb112-f402-4717-bc56-f488fc3c5db7-ovn-node-metrics-cert\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986342 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-run-systemd\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986374 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qpl2\" (UniqueName: \"kubernetes.io/projected/842bb112-f402-4717-bc56-f488fc3c5db7-kube-api-access-2qpl2\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986443 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-slash\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986509 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-cni-bin\") pod 
\"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986590 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-run-ovn\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986622 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-run-ovn-kubernetes\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986714 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-run-netns\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986743 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-etc-openvswitch\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986771 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-kubelet\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986846 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/842bb112-f402-4717-bc56-f488fc3c5db7-ovnkube-config\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986888 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-run-openvswitch\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986923 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-cni-netd\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986954 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986998 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-node-log\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987091 4816 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987113 4816 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987134 4816 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987153 4816 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987170 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987186 4816 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc 
kubenswrapper[4816]: I0316 00:21:00.987201 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rd68\" (UniqueName: \"kubernetes.io/projected/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-kube-api-access-9rd68\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987216 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987258 4816 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987274 4816 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987289 4816 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987323 4816 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-log-socket\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987338 4816 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987352 4816 reconciler_common.go:293] "Volume detached for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987368 4816 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987384 4816 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987398 4816 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987413 4816 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-node-log\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987426 4816 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-slash\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987441 4816 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089067 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-var-lib-openvswitch\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089172 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-log-socket\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089209 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/842bb112-f402-4717-bc56-f488fc3c5db7-ovnkube-script-lib\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089263 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-var-lib-openvswitch\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089312 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-log-socket\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089343 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/842bb112-f402-4717-bc56-f488fc3c5db7-ovn-node-metrics-cert\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089409 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-run-systemd\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089462 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qpl2\" (UniqueName: \"kubernetes.io/projected/842bb112-f402-4717-bc56-f488fc3c5db7-kube-api-access-2qpl2\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089503 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-slash\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089526 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-cni-bin\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089567 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-run-ovn\") pod 
\"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089623 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-cni-bin\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089619 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-run-systemd\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089673 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-run-ovn-kubernetes\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089654 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-run-ovn\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089632 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-slash\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" 
Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089706 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-run-ovn-kubernetes\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089825 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-run-netns\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089870 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-etc-openvswitch\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089893 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-kubelet\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089923 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/842bb112-f402-4717-bc56-f488fc3c5db7-ovnkube-config\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089950 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-run-netns\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089960 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-run-openvswitch\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089990 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-run-openvswitch\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090042 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-etc-openvswitch\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090065 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-cni-netd\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090096 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-kubelet\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090102 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090172 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-node-log\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090211 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-systemd-units\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090255 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/842bb112-f402-4717-bc56-f488fc3c5db7-env-overrides\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090531 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090582 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-node-log\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090596 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-systemd-units\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090627 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-cni-netd\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090953 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/842bb112-f402-4717-bc56-f488fc3c5db7-ovnkube-script-lib\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.091142 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/842bb112-f402-4717-bc56-f488fc3c5db7-ovnkube-config\") pod 
\"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.091231 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/842bb112-f402-4717-bc56-f488fc3c5db7-env-overrides\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.094909 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/842bb112-f402-4717-bc56-f488fc3c5db7-ovn-node-metrics-cert\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.110355 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qpl2\" (UniqueName: \"kubernetes.io/projected/842bb112-f402-4717-bc56-f488fc3c5db7-kube-api-access-2qpl2\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.126015 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: W0316 00:21:01.150141 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod842bb112_f402_4717_bc56_f488fc3c5db7.slice/crio-f8b857aaf91a1aa57b19fc29051ada8ec2a1a9b761c683ed3e9fa9e5d97497f5 WatchSource:0}: Error finding container f8b857aaf91a1aa57b19fc29051ada8ec2a1a9b761c683ed3e9fa9e5d97497f5: Status 404 returned error can't find the container with id f8b857aaf91a1aa57b19fc29051ada8ec2a1a9b761c683ed3e9fa9e5d97497f5 Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.728388 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovn-acl-logging/0.log" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.729447 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovn-controller/0.log" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.730201 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.731960 4816 generic.go:334] "Generic (PLEG): container finished" podID="842bb112-f402-4717-bc56-f488fc3c5db7" containerID="692279448e8a204929ac728470e76248b7a686ab449a7b870b91a40eb34e40de" exitCode=0 Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.732020 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerDied","Data":"692279448e8a204929ac728470e76248b7a686ab449a7b870b91a40eb34e40de"} Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.732041 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerStarted","Data":"f8b857aaf91a1aa57b19fc29051ada8ec2a1a9b761c683ed3e9fa9e5d97497f5"} Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.735086 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-szscw_e9789e58-12c8-4831-9401-af48a3e92209/kube-multus/2.log" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.735130 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-szscw" event={"ID":"e9789e58-12c8-4831-9401-af48a3e92209","Type":"ContainerStarted","Data":"4bde4ed98c5f5d1c0d8946acfc8cc13121f014a1d939f0f14b6cd0165659d331"} Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.818898 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-psjs7"] Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.821987 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-psjs7"] Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.476230 4816 scope.go:117] "RemoveContainer" 
containerID="1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.492774 4816 scope.go:117] "RemoveContainer" containerID="4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.511354 4816 scope.go:117] "RemoveContainer" containerID="d0a220f8f08fc88ffdf56d37ec2ba1b59974be62f3a81d988b1462b4794a79a8" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.541221 4816 scope.go:117] "RemoveContainer" containerID="4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.563284 4816 scope.go:117] "RemoveContainer" containerID="f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.581353 4816 scope.go:117] "RemoveContainer" containerID="0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.602473 4816 scope.go:117] "RemoveContainer" containerID="aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.619626 4816 scope.go:117] "RemoveContainer" containerID="826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.632076 4816 scope.go:117] "RemoveContainer" containerID="86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.657041 4816 scope.go:117] "RemoveContainer" containerID="7e5d87dc1889484bb2175c0613eda0b852c65a289a1c165f6adae2a822892aa2" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.675662 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" path="/var/lib/kubelet/pods/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/volumes" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 
00:21:03.750781 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerStarted","Data":"ab1b71595d8967f55cac92c5aec109d46f21632056549e44c80fd734ab566962"} Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.750825 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerStarted","Data":"a7f8583fc78de60d0291b7007caac080e35b5233fc3c3df2e84a937546c476de"} Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.750840 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerStarted","Data":"eadd6738e6ec7982f52dd38e85081dc5f198ede817985659bad8d0a6b04e7d06"} Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.750866 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerStarted","Data":"ed9254fe0b52cf509ff910fae8230bfbb9e723a664c7eef81c0fb7062492d7a8"} Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.750878 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerStarted","Data":"c6ff289266bc8c827c219dad4dce737231a4d8d8fa08cdd4e0574caa885f065c"} Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.750889 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerStarted","Data":"5ea9560161ac08ce1fb2fa03da7fc937a29a00f0220b6b4f6421faa145d093b5"} Mar 16 00:21:06 crc kubenswrapper[4816]: I0316 00:21:06.773996 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerStarted","Data":"af5edc529512fefb0161b3b8a090063a6891a4dc66e4e9f4a0046f57b5544748"} Mar 16 00:21:08 crc kubenswrapper[4816]: I0316 00:21:08.791011 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerStarted","Data":"1a51743713484f00c09cc3c0386f957872cc2f2fbe6db30d66224ab6a7bdabd5"} Mar 16 00:21:08 crc kubenswrapper[4816]: I0316 00:21:08.791342 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:08 crc kubenswrapper[4816]: I0316 00:21:08.791356 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:08 crc kubenswrapper[4816]: I0316 00:21:08.791364 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:08 crc kubenswrapper[4816]: I0316 00:21:08.820129 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:08 crc kubenswrapper[4816]: I0316 00:21:08.821887 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:08 crc kubenswrapper[4816]: I0316 00:21:08.825943 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" podStartSLOduration=8.825925989 podStartE2EDuration="8.825925989s" podCreationTimestamp="2026-03-16 00:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:21:08.822866289 +0000 UTC m=+861.919166292" watchObservedRunningTime="2026-03-16 00:21:08.825925989 +0000 UTC 
m=+861.922225942" Mar 16 00:21:31 crc kubenswrapper[4816]: I0316 00:21:31.149923 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:31 crc kubenswrapper[4816]: I0316 00:21:31.864013 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:21:31 crc kubenswrapper[4816]: I0316 00:21:31.864377 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.137832 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560342-qq7qg"] Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.141651 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560342-qq7qg" Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.143988 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.144151 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.144177 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.144573 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560342-qq7qg"] Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.276629 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vckw\" (UniqueName: \"kubernetes.io/projected/d60f1a00-e9c6-46ff-b5eb-f3c680f04736-kube-api-access-6vckw\") pod \"auto-csr-approver-29560342-qq7qg\" (UID: \"d60f1a00-e9c6-46ff-b5eb-f3c680f04736\") " pod="openshift-infra/auto-csr-approver-29560342-qq7qg" Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.378440 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vckw\" (UniqueName: \"kubernetes.io/projected/d60f1a00-e9c6-46ff-b5eb-f3c680f04736-kube-api-access-6vckw\") pod \"auto-csr-approver-29560342-qq7qg\" (UID: \"d60f1a00-e9c6-46ff-b5eb-f3c680f04736\") " pod="openshift-infra/auto-csr-approver-29560342-qq7qg" Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.400391 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vckw\" (UniqueName: \"kubernetes.io/projected/d60f1a00-e9c6-46ff-b5eb-f3c680f04736-kube-api-access-6vckw\") pod \"auto-csr-approver-29560342-qq7qg\" (UID: \"d60f1a00-e9c6-46ff-b5eb-f3c680f04736\") " 
pod="openshift-infra/auto-csr-approver-29560342-qq7qg" Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.461203 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560342-qq7qg" Mar 16 00:22:01 crc kubenswrapper[4816]: I0316 00:22:00.662902 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560342-qq7qg"] Mar 16 00:22:01 crc kubenswrapper[4816]: I0316 00:22:00.669102 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:22:01 crc kubenswrapper[4816]: I0316 00:22:01.093174 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560342-qq7qg" event={"ID":"d60f1a00-e9c6-46ff-b5eb-f3c680f04736","Type":"ContainerStarted","Data":"4f4e0cce66b2e8f404303d0d5f05b8d4e1ec1593cfc9384ac6f5c65a0d46c71e"} Mar 16 00:22:01 crc kubenswrapper[4816]: I0316 00:22:01.863149 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:22:01 crc kubenswrapper[4816]: I0316 00:22:01.863222 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:22:02 crc kubenswrapper[4816]: I0316 00:22:02.100167 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560342-qq7qg" event={"ID":"d60f1a00-e9c6-46ff-b5eb-f3c680f04736","Type":"ContainerStarted","Data":"dbd7c0bfa602e132787d7d6d843e255ebdb6acf34354466437ff4e5db80a17a7"} Mar 16 00:22:02 
crc kubenswrapper[4816]: I0316 00:22:02.120431 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29560342-qq7qg" podStartSLOduration=1.047352211 podStartE2EDuration="2.120411942s" podCreationTimestamp="2026-03-16 00:22:00 +0000 UTC" firstStartedPulling="2026-03-16 00:22:00.668921977 +0000 UTC m=+913.765221930" lastFinishedPulling="2026-03-16 00:22:01.741981698 +0000 UTC m=+914.838281661" observedRunningTime="2026-03-16 00:22:02.118174407 +0000 UTC m=+915.214474370" watchObservedRunningTime="2026-03-16 00:22:02.120411942 +0000 UTC m=+915.216711895" Mar 16 00:22:03 crc kubenswrapper[4816]: I0316 00:22:03.109771 4816 generic.go:334] "Generic (PLEG): container finished" podID="d60f1a00-e9c6-46ff-b5eb-f3c680f04736" containerID="dbd7c0bfa602e132787d7d6d843e255ebdb6acf34354466437ff4e5db80a17a7" exitCode=0 Mar 16 00:22:03 crc kubenswrapper[4816]: I0316 00:22:03.109853 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560342-qq7qg" event={"ID":"d60f1a00-e9c6-46ff-b5eb-f3c680f04736","Type":"ContainerDied","Data":"dbd7c0bfa602e132787d7d6d843e255ebdb6acf34354466437ff4e5db80a17a7"} Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.351523 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560342-qq7qg" Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.432267 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vckw\" (UniqueName: \"kubernetes.io/projected/d60f1a00-e9c6-46ff-b5eb-f3c680f04736-kube-api-access-6vckw\") pod \"d60f1a00-e9c6-46ff-b5eb-f3c680f04736\" (UID: \"d60f1a00-e9c6-46ff-b5eb-f3c680f04736\") " Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.442350 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d60f1a00-e9c6-46ff-b5eb-f3c680f04736-kube-api-access-6vckw" (OuterVolumeSpecName: "kube-api-access-6vckw") pod "d60f1a00-e9c6-46ff-b5eb-f3c680f04736" (UID: "d60f1a00-e9c6-46ff-b5eb-f3c680f04736"). InnerVolumeSpecName "kube-api-access-6vckw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.459036 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvgvc"] Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.459291 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nvgvc" podUID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerName="registry-server" containerID="cri-o://5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a" gracePeriod=30 Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.534104 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vckw\" (UniqueName: \"kubernetes.io/projected/d60f1a00-e9c6-46ff-b5eb-f3c680f04736-kube-api-access-6vckw\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.760220 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.836986 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-catalog-content\") pod \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.837032 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-utilities\") pod \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.837055 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh6xk\" (UniqueName: \"kubernetes.io/projected/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-kube-api-access-qh6xk\") pod \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.838900 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-utilities" (OuterVolumeSpecName: "utilities") pod "63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" (UID: "63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.841431 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-kube-api-access-qh6xk" (OuterVolumeSpecName: "kube-api-access-qh6xk") pod "63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" (UID: "63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa"). InnerVolumeSpecName "kube-api-access-qh6xk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.863490 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" (UID: "63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.938963 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.939004 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.939017 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh6xk\" (UniqueName: \"kubernetes.io/projected/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-kube-api-access-qh6xk\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.122618 4816 generic.go:334] "Generic (PLEG): container finished" podID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerID="5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a" exitCode=0 Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.122668 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.122692 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvgvc" event={"ID":"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa","Type":"ContainerDied","Data":"5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a"} Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.122737 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvgvc" event={"ID":"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa","Type":"ContainerDied","Data":"3956bdc0939ca6c80a18b82143c55a4cfebb9af362a0d61193b0fe36b4f051bd"} Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.122759 4816 scope.go:117] "RemoveContainer" containerID="5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.123996 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560342-qq7qg" event={"ID":"d60f1a00-e9c6-46ff-b5eb-f3c680f04736","Type":"ContainerDied","Data":"4f4e0cce66b2e8f404303d0d5f05b8d4e1ec1593cfc9384ac6f5c65a0d46c71e"} Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.124027 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f4e0cce66b2e8f404303d0d5f05b8d4e1ec1593cfc9384ac6f5c65a0d46c71e" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.124065 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560342-qq7qg" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.147088 4816 scope.go:117] "RemoveContainer" containerID="6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.175193 4816 scope.go:117] "RemoveContainer" containerID="4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.180653 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvgvc"] Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.184289 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvgvc"] Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.189221 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560336-fncq8"] Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.190971 4816 scope.go:117] "RemoveContainer" containerID="5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a" Mar 16 00:22:05 crc kubenswrapper[4816]: E0316 00:22:05.191413 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a\": container with ID starting with 5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a not found: ID does not exist" containerID="5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.191469 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a"} err="failed to get container status \"5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a\": rpc error: code = NotFound desc = could not find container 
\"5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a\": container with ID starting with 5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a not found: ID does not exist" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.191515 4816 scope.go:117] "RemoveContainer" containerID="6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64" Mar 16 00:22:05 crc kubenswrapper[4816]: E0316 00:22:05.191875 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64\": container with ID starting with 6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64 not found: ID does not exist" containerID="6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.192029 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64"} err="failed to get container status \"6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64\": rpc error: code = NotFound desc = could not find container \"6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64\": container with ID starting with 6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64 not found: ID does not exist" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.192096 4816 scope.go:117] "RemoveContainer" containerID="4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096" Mar 16 00:22:05 crc kubenswrapper[4816]: E0316 00:22:05.192584 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096\": container with ID starting with 4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096 not found: ID does not exist" 
containerID="4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.192615 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096"} err="failed to get container status \"4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096\": rpc error: code = NotFound desc = could not find container \"4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096\": container with ID starting with 4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096 not found: ID does not exist" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.193723 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560336-fncq8"] Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.673700 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" path="/var/lib/kubelet/pods/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa/volumes" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.674777 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b478a542-14c7-4cca-9f95-64766b34df27" path="/var/lib/kubelet/pods/b478a542-14c7-4cca-9f95-64766b34df27/volumes" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.431599 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4"] Mar 16 00:22:08 crc kubenswrapper[4816]: E0316 00:22:08.432111 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerName="extract-content" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.432123 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerName="extract-content" Mar 16 00:22:08 crc kubenswrapper[4816]: E0316 
00:22:08.432131 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerName="registry-server" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.432137 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerName="registry-server" Mar 16 00:22:08 crc kubenswrapper[4816]: E0316 00:22:08.432146 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerName="extract-utilities" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.432152 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerName="extract-utilities" Mar 16 00:22:08 crc kubenswrapper[4816]: E0316 00:22:08.432160 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60f1a00-e9c6-46ff-b5eb-f3c680f04736" containerName="oc" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.432166 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60f1a00-e9c6-46ff-b5eb-f3c680f04736" containerName="oc" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.432245 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerName="registry-server" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.432256 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d60f1a00-e9c6-46ff-b5eb-f3c680f04736" containerName="oc" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.432968 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.436175 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.445252 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4"] Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.582776 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.582842 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqn6m\" (UniqueName: \"kubernetes.io/projected/35d36436-ca87-48ef-9a68-484c2335bb33-kube-api-access-wqn6m\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.582939 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: 
I0316 00:22:08.683700 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqn6m\" (UniqueName: \"kubernetes.io/projected/35d36436-ca87-48ef-9a68-484c2335bb33-kube-api-access-wqn6m\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.683803 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.683862 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.684369 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.684524 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.706606 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqn6m\" (UniqueName: \"kubernetes.io/projected/35d36436-ca87-48ef-9a68-484c2335bb33-kube-api-access-wqn6m\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.747517 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.943299 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4"] Mar 16 00:22:09 crc kubenswrapper[4816]: I0316 00:22:09.154533 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" event={"ID":"35d36436-ca87-48ef-9a68-484c2335bb33","Type":"ContainerStarted","Data":"20fa11f8bf7bd536e4ee598e1c06d0e4f08ba15ea4cd82786ae11f1cff7ad5d5"} Mar 16 00:22:09 crc kubenswrapper[4816]: I0316 00:22:09.154633 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" event={"ID":"35d36436-ca87-48ef-9a68-484c2335bb33","Type":"ContainerStarted","Data":"b2e666dffac4d36453ae492f3bd8ee6e9e91e9b6a649968caa1dcade8b909136"} Mar 16 00:22:10 crc kubenswrapper[4816]: I0316 00:22:10.161116 4816 
generic.go:334] "Generic (PLEG): container finished" podID="35d36436-ca87-48ef-9a68-484c2335bb33" containerID="20fa11f8bf7bd536e4ee598e1c06d0e4f08ba15ea4cd82786ae11f1cff7ad5d5" exitCode=0 Mar 16 00:22:10 crc kubenswrapper[4816]: I0316 00:22:10.161193 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" event={"ID":"35d36436-ca87-48ef-9a68-484c2335bb33","Type":"ContainerDied","Data":"20fa11f8bf7bd536e4ee598e1c06d0e4f08ba15ea4cd82786ae11f1cff7ad5d5"} Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.381063 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-284nb"] Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.382876 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.389012 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-284nb"] Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.520434 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-catalog-content\") pod \"redhat-operators-284nb\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.520513 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-utilities\") pod \"redhat-operators-284nb\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.520612 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44s4s\" (UniqueName: \"kubernetes.io/projected/449b2c21-4396-4d46-af73-e670b282f831-kube-api-access-44s4s\") pod \"redhat-operators-284nb\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.622242 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44s4s\" (UniqueName: \"kubernetes.io/projected/449b2c21-4396-4d46-af73-e670b282f831-kube-api-access-44s4s\") pod \"redhat-operators-284nb\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.622838 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-catalog-content\") pod \"redhat-operators-284nb\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.622895 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-utilities\") pod \"redhat-operators-284nb\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.623381 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-utilities\") pod \"redhat-operators-284nb\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.623515 4816 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-catalog-content\") pod \"redhat-operators-284nb\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.643203 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44s4s\" (UniqueName: \"kubernetes.io/projected/449b2c21-4396-4d46-af73-e670b282f831-kube-api-access-44s4s\") pod \"redhat-operators-284nb\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.723089 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.917034 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-284nb"] Mar 16 00:22:12 crc kubenswrapper[4816]: I0316 00:22:12.173511 4816 generic.go:334] "Generic (PLEG): container finished" podID="449b2c21-4396-4d46-af73-e670b282f831" containerID="1891eef8092d3a2994e7d2d76188c04ec2abc90123f32af569269e31014fdb64" exitCode=0 Mar 16 00:22:12 crc kubenswrapper[4816]: I0316 00:22:12.173704 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284nb" event={"ID":"449b2c21-4396-4d46-af73-e670b282f831","Type":"ContainerDied","Data":"1891eef8092d3a2994e7d2d76188c04ec2abc90123f32af569269e31014fdb64"} Mar 16 00:22:12 crc kubenswrapper[4816]: I0316 00:22:12.173904 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284nb" event={"ID":"449b2c21-4396-4d46-af73-e670b282f831","Type":"ContainerStarted","Data":"d18b65047ef76f8aa8382e87295979d6bc4c30a17f5c2e1a21242e76f4c64c95"} Mar 16 00:22:12 crc kubenswrapper[4816]: I0316 00:22:12.175601 4816 generic.go:334] 
"Generic (PLEG): container finished" podID="35d36436-ca87-48ef-9a68-484c2335bb33" containerID="666c84ff5793be20b033df37aa373da029267c23d1648b80c2fb628543806f79" exitCode=0 Mar 16 00:22:12 crc kubenswrapper[4816]: I0316 00:22:12.175637 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" event={"ID":"35d36436-ca87-48ef-9a68-484c2335bb33","Type":"ContainerDied","Data":"666c84ff5793be20b033df37aa373da029267c23d1648b80c2fb628543806f79"} Mar 16 00:22:13 crc kubenswrapper[4816]: I0316 00:22:13.183794 4816 generic.go:334] "Generic (PLEG): container finished" podID="35d36436-ca87-48ef-9a68-484c2335bb33" containerID="7656017a9f30cdee825f1f174f1a6b26741bc9f989f7fb768e5f61a6979f53e4" exitCode=0 Mar 16 00:22:13 crc kubenswrapper[4816]: I0316 00:22:13.183847 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" event={"ID":"35d36436-ca87-48ef-9a68-484c2335bb33","Type":"ContainerDied","Data":"7656017a9f30cdee825f1f174f1a6b26741bc9f989f7fb768e5f61a6979f53e4"} Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.193609 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284nb" event={"ID":"449b2c21-4396-4d46-af73-e670b282f831","Type":"ContainerStarted","Data":"6675088a9bc77bc6e858928c42b983a7c8adb51e5f6f5dc372c160b6c32fce59"} Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.463065 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.635507 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-util\") pod \"35d36436-ca87-48ef-9a68-484c2335bb33\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.635722 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqn6m\" (UniqueName: \"kubernetes.io/projected/35d36436-ca87-48ef-9a68-484c2335bb33-kube-api-access-wqn6m\") pod \"35d36436-ca87-48ef-9a68-484c2335bb33\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.635801 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-bundle\") pod \"35d36436-ca87-48ef-9a68-484c2335bb33\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.639032 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-bundle" (OuterVolumeSpecName: "bundle") pod "35d36436-ca87-48ef-9a68-484c2335bb33" (UID: "35d36436-ca87-48ef-9a68-484c2335bb33"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.643714 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35d36436-ca87-48ef-9a68-484c2335bb33-kube-api-access-wqn6m" (OuterVolumeSpecName: "kube-api-access-wqn6m") pod "35d36436-ca87-48ef-9a68-484c2335bb33" (UID: "35d36436-ca87-48ef-9a68-484c2335bb33"). InnerVolumeSpecName "kube-api-access-wqn6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.657292 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-util" (OuterVolumeSpecName: "util") pod "35d36436-ca87-48ef-9a68-484c2335bb33" (UID: "35d36436-ca87-48ef-9a68-484c2335bb33"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.737221 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.737250 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-util\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.737259 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqn6m\" (UniqueName: \"kubernetes.io/projected/35d36436-ca87-48ef-9a68-484c2335bb33-kube-api-access-wqn6m\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.202921 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" event={"ID":"35d36436-ca87-48ef-9a68-484c2335bb33","Type":"ContainerDied","Data":"b2e666dffac4d36453ae492f3bd8ee6e9e91e9b6a649968caa1dcade8b909136"} Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.203232 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2e666dffac4d36453ae492f3bd8ee6e9e91e9b6a649968caa1dcade8b909136" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.203006 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.207184 4816 generic.go:334] "Generic (PLEG): container finished" podID="449b2c21-4396-4d46-af73-e670b282f831" containerID="6675088a9bc77bc6e858928c42b983a7c8adb51e5f6f5dc372c160b6c32fce59" exitCode=0 Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.207233 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284nb" event={"ID":"449b2c21-4396-4d46-af73-e670b282f831","Type":"ContainerDied","Data":"6675088a9bc77bc6e858928c42b983a7c8adb51e5f6f5dc372c160b6c32fce59"} Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.219352 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z"] Mar 16 00:22:15 crc kubenswrapper[4816]: E0316 00:22:15.219707 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d36436-ca87-48ef-9a68-484c2335bb33" containerName="util" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.219735 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d36436-ca87-48ef-9a68-484c2335bb33" containerName="util" Mar 16 00:22:15 crc kubenswrapper[4816]: E0316 00:22:15.219758 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d36436-ca87-48ef-9a68-484c2335bb33" containerName="extract" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.219767 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d36436-ca87-48ef-9a68-484c2335bb33" containerName="extract" Mar 16 00:22:15 crc kubenswrapper[4816]: E0316 00:22:15.219780 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d36436-ca87-48ef-9a68-484c2335bb33" containerName="pull" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.219788 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="35d36436-ca87-48ef-9a68-484c2335bb33" containerName="pull" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.219914 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d36436-ca87-48ef-9a68-484c2335bb33" containerName="extract" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.221066 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.223012 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.229421 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z"] Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.343531 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.343882 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2nxj\" (UniqueName: \"kubernetes.io/projected/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-kube-api-access-v2nxj\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.343920 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.444624 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2nxj\" (UniqueName: \"kubernetes.io/projected/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-kube-api-access-v2nxj\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.444826 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.444903 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.445610 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-bundle\") 
pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.445754 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.471688 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2nxj\" (UniqueName: \"kubernetes.io/projected/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-kube-api-access-v2nxj\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.539592 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.756703 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z"] Mar 16 00:22:15 crc kubenswrapper[4816]: W0316 00:22:15.760613 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e8bb6e1_f8fd_4484_ba21_a2d5f80f0d1c.slice/crio-37c72f8cb01feb6ceb4fe0acb9fb6503fe2ed23ead252cb349195e1ac79b8d98 WatchSource:0}: Error finding container 37c72f8cb01feb6ceb4fe0acb9fb6503fe2ed23ead252cb349195e1ac79b8d98: Status 404 returned error can't find the container with id 37c72f8cb01feb6ceb4fe0acb9fb6503fe2ed23ead252cb349195e1ac79b8d98 Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.219985 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284nb" event={"ID":"449b2c21-4396-4d46-af73-e670b282f831","Type":"ContainerStarted","Data":"408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861"} Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.222798 4816 generic.go:334] "Generic (PLEG): container finished" podID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerID="5e2ebc35c72ed41a7d716bbf1b0af7bffa3e1ec2eac67adc6dcfa6135e61c059" exitCode=0 Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.222844 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" event={"ID":"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c","Type":"ContainerDied","Data":"5e2ebc35c72ed41a7d716bbf1b0af7bffa3e1ec2eac67adc6dcfa6135e61c059"} Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.222871 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" event={"ID":"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c","Type":"ContainerStarted","Data":"37c72f8cb01feb6ceb4fe0acb9fb6503fe2ed23ead252cb349195e1ac79b8d98"} Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.233500 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9"] Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.238140 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.245187 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9"] Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.250196 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-284nb" podStartSLOduration=1.549956339 podStartE2EDuration="5.250167108s" podCreationTimestamp="2026-03-16 00:22:11 +0000 UTC" firstStartedPulling="2026-03-16 00:22:12.176427476 +0000 UTC m=+925.272727429" lastFinishedPulling="2026-03-16 00:22:15.876638245 +0000 UTC m=+928.972938198" observedRunningTime="2026-03-16 00:22:16.239884422 +0000 UTC m=+929.336184415" watchObservedRunningTime="2026-03-16 00:22:16.250167108 +0000 UTC m=+929.346467101" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.358992 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 
16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.359106 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qp2v\" (UniqueName: \"kubernetes.io/projected/43895212-4bba-4d69-b3eb-10f49e771de3-kube-api-access-4qp2v\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.359177 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.460404 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qp2v\" (UniqueName: \"kubernetes.io/projected/43895212-4bba-4d69-b3eb-10f49e771de3-kube-api-access-4qp2v\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.460493 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.460536 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.461109 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.461373 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.484041 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qp2v\" (UniqueName: \"kubernetes.io/projected/43895212-4bba-4d69-b3eb-10f49e771de3-kube-api-access-4qp2v\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.555349 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.734932 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9"] Mar 16 00:22:17 crc kubenswrapper[4816]: I0316 00:22:17.232894 4816 generic.go:334] "Generic (PLEG): container finished" podID="43895212-4bba-4d69-b3eb-10f49e771de3" containerID="25e48dfc32cfb5fae27e0f685f011357f858ff603b6c0a57b3df0dd8b97c8d60" exitCode=0 Mar 16 00:22:17 crc kubenswrapper[4816]: I0316 00:22:17.233007 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" event={"ID":"43895212-4bba-4d69-b3eb-10f49e771de3","Type":"ContainerDied","Data":"25e48dfc32cfb5fae27e0f685f011357f858ff603b6c0a57b3df0dd8b97c8d60"} Mar 16 00:22:17 crc kubenswrapper[4816]: I0316 00:22:17.233214 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" event={"ID":"43895212-4bba-4d69-b3eb-10f49e771de3","Type":"ContainerStarted","Data":"bc98dd0678141051e979632a0423832d0f036d7e8d23226dff5b4e233bb6610e"} Mar 16 00:22:17 crc kubenswrapper[4816]: I0316 00:22:17.237997 4816 generic.go:334] "Generic (PLEG): container finished" podID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerID="273c9ff624d959a6aab76f887c0b36554d89ab051a50554133a0618a2408c4ff" exitCode=0 Mar 16 00:22:17 crc kubenswrapper[4816]: I0316 00:22:17.238106 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" event={"ID":"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c","Type":"ContainerDied","Data":"273c9ff624d959a6aab76f887c0b36554d89ab051a50554133a0618a2408c4ff"} Mar 16 00:22:18 crc kubenswrapper[4816]: I0316 00:22:18.246799 4816 
generic.go:334] "Generic (PLEG): container finished" podID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerID="c7254f1bc611c01cc414758993eb7cab5b4e6b7c3657115ba8dad1b2b02641ea" exitCode=0 Mar 16 00:22:18 crc kubenswrapper[4816]: I0316 00:22:18.246854 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" event={"ID":"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c","Type":"ContainerDied","Data":"c7254f1bc611c01cc414758993eb7cab5b4e6b7c3657115ba8dad1b2b02641ea"} Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.257637 4816 generic.go:334] "Generic (PLEG): container finished" podID="43895212-4bba-4d69-b3eb-10f49e771de3" containerID="846f4dc07d75a6d38979bd0bcefc6e0f7956ffb9542062ed83347ce157270bfd" exitCode=0 Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.257740 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" event={"ID":"43895212-4bba-4d69-b3eb-10f49e771de3","Type":"ContainerDied","Data":"846f4dc07d75a6d38979bd0bcefc6e0f7956ffb9542062ed83347ce157270bfd"} Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.653760 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.803566 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2nxj\" (UniqueName: \"kubernetes.io/projected/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-kube-api-access-v2nxj\") pod \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.803638 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-util\") pod \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.803671 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-bundle\") pod \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.805362 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-bundle" (OuterVolumeSpecName: "bundle") pod "3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" (UID: "3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.813700 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-kube-api-access-v2nxj" (OuterVolumeSpecName: "kube-api-access-v2nxj") pod "3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" (UID: "3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c"). InnerVolumeSpecName "kube-api-access-v2nxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.826821 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-util" (OuterVolumeSpecName: "util") pod "3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" (UID: "3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.905498 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2nxj\" (UniqueName: \"kubernetes.io/projected/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-kube-api-access-v2nxj\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.905533 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-util\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.905558 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.182014 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jjrq7"] Mar 16 00:22:20 crc kubenswrapper[4816]: E0316 00:22:20.182256 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerName="pull" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.182274 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerName="pull" Mar 16 00:22:20 crc kubenswrapper[4816]: E0316 00:22:20.182291 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerName="util" Mar 16 
00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.182298 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerName="util" Mar 16 00:22:20 crc kubenswrapper[4816]: E0316 00:22:20.182344 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerName="extract" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.182353 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerName="extract" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.182472 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerName="extract" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.183421 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.237000 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jjrq7"] Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.265059 4816 generic.go:334] "Generic (PLEG): container finished" podID="43895212-4bba-4d69-b3eb-10f49e771de3" containerID="9753285d470f349be7b55d609e2dfe0e40687af01b3d1e9210904b74dc0363d8" exitCode=0 Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.265120 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" event={"ID":"43895212-4bba-4d69-b3eb-10f49e771de3","Type":"ContainerDied","Data":"9753285d470f349be7b55d609e2dfe0e40687af01b3d1e9210904b74dc0363d8"} Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.267015 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" 
event={"ID":"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c","Type":"ContainerDied","Data":"37c72f8cb01feb6ceb4fe0acb9fb6503fe2ed23ead252cb349195e1ac79b8d98"} Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.267038 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37c72f8cb01feb6ceb4fe0acb9fb6503fe2ed23ead252cb349195e1ac79b8d98" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.267070 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.309739 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-utilities\") pod \"certified-operators-jjrq7\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.309867 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62z8f\" (UniqueName: \"kubernetes.io/projected/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-kube-api-access-62z8f\") pod \"certified-operators-jjrq7\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.309900 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-catalog-content\") pod \"certified-operators-jjrq7\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.410676 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-62z8f\" (UniqueName: \"kubernetes.io/projected/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-kube-api-access-62z8f\") pod \"certified-operators-jjrq7\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.411279 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-catalog-content\") pod \"certified-operators-jjrq7\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.411434 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-utilities\") pod \"certified-operators-jjrq7\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.411765 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-catalog-content\") pod \"certified-operators-jjrq7\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.411867 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-utilities\") pod \"certified-operators-jjrq7\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.458252 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62z8f\" (UniqueName: 
\"kubernetes.io/projected/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-kube-api-access-62z8f\") pod \"certified-operators-jjrq7\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.498210 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.023782 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jjrq7"] Mar 16 00:22:21 crc kubenswrapper[4816]: W0316 00:22:21.028468 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd502b2b7_9ca7_4b92_bb7f_d1639a6a7ef2.slice/crio-5750fa2beb8413e4bbf66bc2e4ec103cf698ec3743191fcfdc001a528f09e12f WatchSource:0}: Error finding container 5750fa2beb8413e4bbf66bc2e4ec103cf698ec3743191fcfdc001a528f09e12f: Status 404 returned error can't find the container with id 5750fa2beb8413e4bbf66bc2e4ec103cf698ec3743191fcfdc001a528f09e12f Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.273926 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrq7" event={"ID":"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2","Type":"ContainerStarted","Data":"e5839a641f5e4b47edb9ba28f2918214558d3b4454dab06397a37f22bd1120b7"} Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.274269 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrq7" event={"ID":"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2","Type":"ContainerStarted","Data":"5750fa2beb8413e4bbf66bc2e4ec103cf698ec3743191fcfdc001a528f09e12f"} Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.679565 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.724306 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.727773 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.827718 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qp2v\" (UniqueName: \"kubernetes.io/projected/43895212-4bba-4d69-b3eb-10f49e771de3-kube-api-access-4qp2v\") pod \"43895212-4bba-4d69-b3eb-10f49e771de3\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.827992 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-bundle\") pod \"43895212-4bba-4d69-b3eb-10f49e771de3\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.828130 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-util\") pod \"43895212-4bba-4d69-b3eb-10f49e771de3\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.828863 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-bundle" (OuterVolumeSpecName: "bundle") pod "43895212-4bba-4d69-b3eb-10f49e771de3" (UID: "43895212-4bba-4d69-b3eb-10f49e771de3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.835995 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43895212-4bba-4d69-b3eb-10f49e771de3-kube-api-access-4qp2v" (OuterVolumeSpecName: "kube-api-access-4qp2v") pod "43895212-4bba-4d69-b3eb-10f49e771de3" (UID: "43895212-4bba-4d69-b3eb-10f49e771de3"). InnerVolumeSpecName "kube-api-access-4qp2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.866517 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-util" (OuterVolumeSpecName: "util") pod "43895212-4bba-4d69-b3eb-10f49e771de3" (UID: "43895212-4bba-4d69-b3eb-10f49e771de3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.929214 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qp2v\" (UniqueName: \"kubernetes.io/projected/43895212-4bba-4d69-b3eb-10f49e771de3-kube-api-access-4qp2v\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.929587 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.929601 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-util\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:22 crc kubenswrapper[4816]: I0316 00:22:22.282350 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:22 crc kubenswrapper[4816]: I0316 00:22:22.282339 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" event={"ID":"43895212-4bba-4d69-b3eb-10f49e771de3","Type":"ContainerDied","Data":"bc98dd0678141051e979632a0423832d0f036d7e8d23226dff5b4e233bb6610e"} Mar 16 00:22:22 crc kubenswrapper[4816]: I0316 00:22:22.282471 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc98dd0678141051e979632a0423832d0f036d7e8d23226dff5b4e233bb6610e" Mar 16 00:22:22 crc kubenswrapper[4816]: I0316 00:22:22.284160 4816 generic.go:334] "Generic (PLEG): container finished" podID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerID="e5839a641f5e4b47edb9ba28f2918214558d3b4454dab06397a37f22bd1120b7" exitCode=0 Mar 16 00:22:22 crc kubenswrapper[4816]: I0316 00:22:22.284243 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrq7" event={"ID":"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2","Type":"ContainerDied","Data":"e5839a641f5e4b47edb9ba28f2918214558d3b4454dab06397a37f22bd1120b7"} Mar 16 00:22:22 crc kubenswrapper[4816]: I0316 00:22:22.780075 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-284nb" podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="registry-server" probeResult="failure" output=< Mar 16 00:22:22 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 16 00:22:22 crc kubenswrapper[4816]: > Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.295361 4816 generic.go:334] "Generic (PLEG): container finished" podID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerID="79acb07939ccc5b80b6a155b3eb5d1de07ee6d3e3a71e7de4d918cb8ef32d1ed" exitCode=0 Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 
00:22:24.295469 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrq7" event={"ID":"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2","Type":"ContainerDied","Data":"79acb07939ccc5b80b6a155b3eb5d1de07ee6d3e3a71e7de4d918cb8ef32d1ed"} Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.639085 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l"] Mar 16 00:22:24 crc kubenswrapper[4816]: E0316 00:22:24.639318 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43895212-4bba-4d69-b3eb-10f49e771de3" containerName="pull" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.639339 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="43895212-4bba-4d69-b3eb-10f49e771de3" containerName="pull" Mar 16 00:22:24 crc kubenswrapper[4816]: E0316 00:22:24.639363 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43895212-4bba-4d69-b3eb-10f49e771de3" containerName="util" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.639372 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="43895212-4bba-4d69-b3eb-10f49e771de3" containerName="util" Mar 16 00:22:24 crc kubenswrapper[4816]: E0316 00:22:24.639381 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43895212-4bba-4d69-b3eb-10f49e771de3" containerName="extract" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.639390 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="43895212-4bba-4d69-b3eb-10f49e771de3" containerName="extract" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.639505 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="43895212-4bba-4d69-b3eb-10f49e771de3" containerName="extract" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.640431 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.643191 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.650743 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l"] Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.762526 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.762602 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.762654 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4582c\" (UniqueName: \"kubernetes.io/projected/1da45fda-a8cc-46c1-8831-58418ecc9819-kube-api-access-4582c\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: 
I0316 00:22:24.846221 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44"] Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.846834 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.849426 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.849888 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-ft6r8" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.851591 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.857374 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44"] Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.863659 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4582c\" (UniqueName: \"kubernetes.io/projected/1da45fda-a8cc-46c1-8831-58418ecc9819-kube-api-access-4582c\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.863719 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.863765 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.864273 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.864392 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.896429 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4582c\" (UniqueName: \"kubernetes.io/projected/1da45fda-a8cc-46c1-8831-58418ecc9819-kube-api-access-4582c\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.952833 4816 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.965431 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5nfp\" (UniqueName: \"kubernetes.io/projected/562f24fe-5c4c-4540-96ae-6e01f539141b-kube-api-access-x5nfp\") pod \"obo-prometheus-operator-68bc856cb9-tfv44\" (UID: \"562f24fe-5c4c-4540-96ae-6e01f539141b\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.986837 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk"] Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.987485 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.990819 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-wqns4" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.991090 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.996958 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn"] Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.997654 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.004294 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.023651 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.066430 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5nfp\" (UniqueName: \"kubernetes.io/projected/562f24fe-5c4c-4540-96ae-6e01f539141b-kube-api-access-x5nfp\") pod \"obo-prometheus-operator-68bc856cb9-tfv44\" (UID: \"562f24fe-5c4c-4540-96ae-6e01f539141b\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.089625 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5nfp\" (UniqueName: \"kubernetes.io/projected/562f24fe-5c4c-4540-96ae-6e01f539141b-kube-api-access-x5nfp\") pod \"obo-prometheus-operator-68bc856cb9-tfv44\" (UID: \"562f24fe-5c4c-4540-96ae-6e01f539141b\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.162322 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.167517 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a808114-3164-4abe-a481-1b5d3b9df2a0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk\" (UID: \"9a808114-3164-4abe-a481-1b5d3b9df2a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.167586 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a808114-3164-4abe-a481-1b5d3b9df2a0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk\" (UID: \"9a808114-3164-4abe-a481-1b5d3b9df2a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.167614 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/36951342-3370-4291-baa3-2612f64036fd-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn\" (UID: \"36951342-3370-4291-baa3-2612f64036fd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.167651 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/36951342-3370-4291-baa3-2612f64036fd-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn\" (UID: \"36951342-3370-4291-baa3-2612f64036fd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" Mar 16 00:22:25 crc 
kubenswrapper[4816]: I0316 00:22:25.197154 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-w6wv7"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.197990 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.208753 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-m8rpm" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.208984 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.223009 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-w6wv7"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.271595 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a808114-3164-4abe-a481-1b5d3b9df2a0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk\" (UID: \"9a808114-3164-4abe-a481-1b5d3b9df2a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.271681 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a808114-3164-4abe-a481-1b5d3b9df2a0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk\" (UID: \"9a808114-3164-4abe-a481-1b5d3b9df2a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.271721 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" 
(UniqueName: \"kubernetes.io/secret/36951342-3370-4291-baa3-2612f64036fd-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn\" (UID: \"36951342-3370-4291-baa3-2612f64036fd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.271768 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/36951342-3370-4291-baa3-2612f64036fd-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn\" (UID: \"36951342-3370-4291-baa3-2612f64036fd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.280890 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a808114-3164-4abe-a481-1b5d3b9df2a0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk\" (UID: \"9a808114-3164-4abe-a481-1b5d3b9df2a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.280912 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/36951342-3370-4291-baa3-2612f64036fd-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn\" (UID: \"36951342-3370-4291-baa3-2612f64036fd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.280913 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a808114-3164-4abe-a481-1b5d3b9df2a0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk\" (UID: \"9a808114-3164-4abe-a481-1b5d3b9df2a0\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.283694 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/36951342-3370-4291-baa3-2612f64036fd-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn\" (UID: \"36951342-3370-4291-baa3-2612f64036fd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.296285 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.321768 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrq7" event={"ID":"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2","Type":"ContainerStarted","Data":"f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a"} Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.347496 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.347577 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jjrq7" podStartSLOduration=2.930626549 podStartE2EDuration="5.347541967s" podCreationTimestamp="2026-03-16 00:22:20 +0000 UTC" firstStartedPulling="2026-03-16 00:22:22.286508067 +0000 UTC m=+935.382808020" lastFinishedPulling="2026-03-16 00:22:24.703423485 +0000 UTC m=+937.799723438" observedRunningTime="2026-03-16 00:22:25.344016846 +0000 UTC m=+938.440316799" watchObservedRunningTime="2026-03-16 00:22:25.347541967 +0000 UTC m=+938.443841920" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.361851 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.379241 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz5bq\" (UniqueName: \"kubernetes.io/projected/8d0f60fa-8d26-43ea-a680-1d3a92dd270d-kube-api-access-mz5bq\") pod \"observability-operator-59bdc8b94-w6wv7\" (UID: \"8d0f60fa-8d26-43ea-a680-1d3a92dd270d\") " pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.379326 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/8d0f60fa-8d26-43ea-a680-1d3a92dd270d-observability-operator-tls\") pod \"observability-operator-59bdc8b94-w6wv7\" (UID: \"8d0f60fa-8d26-43ea-a680-1d3a92dd270d\") " pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.396830 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/perses-operator-5bf474d74f-t7w7m"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.397717 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.406537 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-mjv48" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.433259 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-t7w7m"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.486180 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz5bq\" (UniqueName: \"kubernetes.io/projected/8d0f60fa-8d26-43ea-a680-1d3a92dd270d-kube-api-access-mz5bq\") pod \"observability-operator-59bdc8b94-w6wv7\" (UID: \"8d0f60fa-8d26-43ea-a680-1d3a92dd270d\") " pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.486282 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/8d0f60fa-8d26-43ea-a680-1d3a92dd270d-observability-operator-tls\") pod \"observability-operator-59bdc8b94-w6wv7\" (UID: \"8d0f60fa-8d26-43ea-a680-1d3a92dd270d\") " pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.491194 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/8d0f60fa-8d26-43ea-a680-1d3a92dd270d-observability-operator-tls\") pod \"observability-operator-59bdc8b94-w6wv7\" (UID: \"8d0f60fa-8d26-43ea-a680-1d3a92dd270d\") " pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.509317 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz5bq\" (UniqueName: \"kubernetes.io/projected/8d0f60fa-8d26-43ea-a680-1d3a92dd270d-kube-api-access-mz5bq\") pod \"observability-operator-59bdc8b94-w6wv7\" (UID: \"8d0f60fa-8d26-43ea-a680-1d3a92dd270d\") " pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.545862 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.569044 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.589096 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f24959c1-f57f-4bf6-8a55-c8a35173ff8b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-t7w7m\" (UID: \"f24959c1-f57f-4bf6-8a55-c8a35173ff8b\") " pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.589179 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g45v\" (UniqueName: \"kubernetes.io/projected/f24959c1-f57f-4bf6-8a55-c8a35173ff8b-kube-api-access-9g45v\") pod \"perses-operator-5bf474d74f-t7w7m\" (UID: \"f24959c1-f57f-4bf6-8a55-c8a35173ff8b\") " pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.689072 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.690205 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g45v\" 
(UniqueName: \"kubernetes.io/projected/f24959c1-f57f-4bf6-8a55-c8a35173ff8b-kube-api-access-9g45v\") pod \"perses-operator-5bf474d74f-t7w7m\" (UID: \"f24959c1-f57f-4bf6-8a55-c8a35173ff8b\") " pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.690288 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f24959c1-f57f-4bf6-8a55-c8a35173ff8b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-t7w7m\" (UID: \"f24959c1-f57f-4bf6-8a55-c8a35173ff8b\") " pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.691499 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f24959c1-f57f-4bf6-8a55-c8a35173ff8b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-t7w7m\" (UID: \"f24959c1-f57f-4bf6-8a55-c8a35173ff8b\") " pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:25 crc kubenswrapper[4816]: W0316 00:22:25.714758 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a808114_3164_4abe_a481_1b5d3b9df2a0.slice/crio-b9994a1aa8a861ee26befc56024cc3fe17c6f7e7ffbbe13fd66ae13f48d231ff WatchSource:0}: Error finding container b9994a1aa8a861ee26befc56024cc3fe17c6f7e7ffbbe13fd66ae13f48d231ff: Status 404 returned error can't find the container with id b9994a1aa8a861ee26befc56024cc3fe17c6f7e7ffbbe13fd66ae13f48d231ff Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.715465 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g45v\" (UniqueName: \"kubernetes.io/projected/f24959c1-f57f-4bf6-8a55-c8a35173ff8b-kube-api-access-9g45v\") pod \"perses-operator-5bf474d74f-t7w7m\" (UID: \"f24959c1-f57f-4bf6-8a55-c8a35173ff8b\") " 
pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.746416 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.899391 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-w6wv7"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.979309 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn"] Mar 16 00:22:26 crc kubenswrapper[4816]: I0316 00:22:26.150469 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-t7w7m"] Mar 16 00:22:26 crc kubenswrapper[4816]: W0316 00:22:26.154344 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf24959c1_f57f_4bf6_8a55_c8a35173ff8b.slice/crio-81b77c3544cb92b1b3ff3f953b9590e553e22ceb2885c94f32353351ab9051c7 WatchSource:0}: Error finding container 81b77c3544cb92b1b3ff3f953b9590e553e22ceb2885c94f32353351ab9051c7: Status 404 returned error can't find the container with id 81b77c3544cb92b1b3ff3f953b9590e553e22ceb2885c94f32353351ab9051c7 Mar 16 00:22:26 crc kubenswrapper[4816]: I0316 00:22:26.329430 4816 generic.go:334] "Generic (PLEG): container finished" podID="1da45fda-a8cc-46c1-8831-58418ecc9819" containerID="c8f418e6cc25e4a0e78eae89961561eed7258b5829ebb9c402f1e3fe0c654d54" exitCode=0 Mar 16 00:22:26 crc kubenswrapper[4816]: I0316 00:22:26.329537 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" event={"ID":"1da45fda-a8cc-46c1-8831-58418ecc9819","Type":"ContainerDied","Data":"c8f418e6cc25e4a0e78eae89961561eed7258b5829ebb9c402f1e3fe0c654d54"} Mar 16 00:22:26 crc 
kubenswrapper[4816]: I0316 00:22:26.329584 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" event={"ID":"1da45fda-a8cc-46c1-8831-58418ecc9819","Type":"ContainerStarted","Data":"423ca09a07f2da9396c84b4219be8387d28d6dd64d1f4c92b01055a8dae546ea"} Mar 16 00:22:26 crc kubenswrapper[4816]: I0316 00:22:26.331137 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" event={"ID":"f24959c1-f57f-4bf6-8a55-c8a35173ff8b","Type":"ContainerStarted","Data":"81b77c3544cb92b1b3ff3f953b9590e553e22ceb2885c94f32353351ab9051c7"} Mar 16 00:22:26 crc kubenswrapper[4816]: I0316 00:22:26.332502 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" event={"ID":"36951342-3370-4291-baa3-2612f64036fd","Type":"ContainerStarted","Data":"b1b87b98fc34f49cfda8b008ea32b014b1f814bc1cc1fbd9cf8085c1089ce8ad"} Mar 16 00:22:26 crc kubenswrapper[4816]: I0316 00:22:26.334343 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" event={"ID":"8d0f60fa-8d26-43ea-a680-1d3a92dd270d","Type":"ContainerStarted","Data":"135ed522473def6429d8458aff130077f8733eb170e1395ff9e94fcf67d9cb23"} Mar 16 00:22:26 crc kubenswrapper[4816]: I0316 00:22:26.335461 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44" event={"ID":"562f24fe-5c4c-4540-96ae-6e01f539141b","Type":"ContainerStarted","Data":"4e5b9318c1f0f7a04ad87c88c6d75e4c42de00878e25b32bc150300ec987975e"} Mar 16 00:22:26 crc kubenswrapper[4816]: I0316 00:22:26.336910 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" 
event={"ID":"9a808114-3164-4abe-a481-1b5d3b9df2a0","Type":"ContainerStarted","Data":"b9994a1aa8a861ee26befc56024cc3fe17c6f7e7ffbbe13fd66ae13f48d231ff"} Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.499110 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.499636 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.611278 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.767050 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-8494b79c9c-fmbm9"] Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.767774 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.769831 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.770072 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.770623 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-xhl6t" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.770940 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.797317 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-8494b79c9c-fmbm9"] Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.878168 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c23dce2-a24c-4f57-9311-56675376c95e-apiservice-cert\") pod \"elastic-operator-8494b79c9c-fmbm9\" (UID: \"0c23dce2-a24c-4f57-9311-56675376c95e\") " pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.878205 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c23dce2-a24c-4f57-9311-56675376c95e-webhook-cert\") pod \"elastic-operator-8494b79c9c-fmbm9\" (UID: \"0c23dce2-a24c-4f57-9311-56675376c95e\") " pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.878239 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ch7rt\" (UniqueName: \"kubernetes.io/projected/0c23dce2-a24c-4f57-9311-56675376c95e-kube-api-access-ch7rt\") pod \"elastic-operator-8494b79c9c-fmbm9\" (UID: \"0c23dce2-a24c-4f57-9311-56675376c95e\") " pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.980441 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c23dce2-a24c-4f57-9311-56675376c95e-apiservice-cert\") pod \"elastic-operator-8494b79c9c-fmbm9\" (UID: \"0c23dce2-a24c-4f57-9311-56675376c95e\") " pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.980487 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c23dce2-a24c-4f57-9311-56675376c95e-webhook-cert\") pod \"elastic-operator-8494b79c9c-fmbm9\" (UID: \"0c23dce2-a24c-4f57-9311-56675376c95e\") " pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.980517 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch7rt\" (UniqueName: \"kubernetes.io/projected/0c23dce2-a24c-4f57-9311-56675376c95e-kube-api-access-ch7rt\") pod \"elastic-operator-8494b79c9c-fmbm9\" (UID: \"0c23dce2-a24c-4f57-9311-56675376c95e\") " pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.992423 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c23dce2-a24c-4f57-9311-56675376c95e-apiservice-cert\") pod \"elastic-operator-8494b79c9c-fmbm9\" (UID: \"0c23dce2-a24c-4f57-9311-56675376c95e\") " pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.994928 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c23dce2-a24c-4f57-9311-56675376c95e-webhook-cert\") pod \"elastic-operator-8494b79c9c-fmbm9\" (UID: \"0c23dce2-a24c-4f57-9311-56675376c95e\") " pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.014443 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch7rt\" (UniqueName: \"kubernetes.io/projected/0c23dce2-a24c-4f57-9311-56675376c95e-kube-api-access-ch7rt\") pod \"elastic-operator-8494b79c9c-fmbm9\" (UID: \"0c23dce2-a24c-4f57-9311-56675376c95e\") " pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.088178 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.449209 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.761799 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.824077 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.863176 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.863238 4816 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.863299 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.864201 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d940a23c182654ea98c304045d406af01d62b828901045324158f53e5e4988ad"} pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.864267 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" containerID="cri-o://d940a23c182654ea98c304045d406af01d62b828901045324158f53e5e4988ad" gracePeriod=600 Mar 16 00:22:32 crc kubenswrapper[4816]: I0316 00:22:32.396481 4816 generic.go:334] "Generic (PLEG): container finished" podID="dd08ece2-7636-4966-973a-e96a34b70b53" containerID="d940a23c182654ea98c304045d406af01d62b828901045324158f53e5e4988ad" exitCode=0 Mar 16 00:22:32 crc kubenswrapper[4816]: I0316 00:22:32.396911 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerDied","Data":"d940a23c182654ea98c304045d406af01d62b828901045324158f53e5e4988ad"} Mar 16 00:22:32 crc kubenswrapper[4816]: I0316 00:22:32.397003 4816 scope.go:117] "RemoveContainer" 
containerID="054dcd9294a0533063364a3ea7e009e513fea0236f1afad37201a02a85a0eee4" Mar 16 00:22:33 crc kubenswrapper[4816]: I0316 00:22:33.153265 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-kd6nx"] Mar 16 00:22:33 crc kubenswrapper[4816]: I0316 00:22:33.154125 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-kd6nx" Mar 16 00:22:33 crc kubenswrapper[4816]: I0316 00:22:33.173122 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-xgl2z" Mar 16 00:22:33 crc kubenswrapper[4816]: I0316 00:22:33.211659 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-kd6nx"] Mar 16 00:22:33 crc kubenswrapper[4816]: I0316 00:22:33.313992 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpjjm\" (UniqueName: \"kubernetes.io/projected/f26ba6ee-c940-434d-80fe-81c813576ac9-kube-api-access-kpjjm\") pod \"interconnect-operator-5bb49f789d-kd6nx\" (UID: \"f26ba6ee-c940-434d-80fe-81c813576ac9\") " pod="service-telemetry/interconnect-operator-5bb49f789d-kd6nx" Mar 16 00:22:33 crc kubenswrapper[4816]: I0316 00:22:33.415626 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpjjm\" (UniqueName: \"kubernetes.io/projected/f26ba6ee-c940-434d-80fe-81c813576ac9-kube-api-access-kpjjm\") pod \"interconnect-operator-5bb49f789d-kd6nx\" (UID: \"f26ba6ee-c940-434d-80fe-81c813576ac9\") " pod="service-telemetry/interconnect-operator-5bb49f789d-kd6nx" Mar 16 00:22:33 crc kubenswrapper[4816]: I0316 00:22:33.442774 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpjjm\" (UniqueName: \"kubernetes.io/projected/f26ba6ee-c940-434d-80fe-81c813576ac9-kube-api-access-kpjjm\") pod 
\"interconnect-operator-5bb49f789d-kd6nx\" (UID: \"f26ba6ee-c940-434d-80fe-81c813576ac9\") " pod="service-telemetry/interconnect-operator-5bb49f789d-kd6nx" Mar 16 00:22:33 crc kubenswrapper[4816]: I0316 00:22:33.468906 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-kd6nx" Mar 16 00:22:35 crc kubenswrapper[4816]: I0316 00:22:35.179206 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jjrq7"] Mar 16 00:22:35 crc kubenswrapper[4816]: I0316 00:22:35.179421 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jjrq7" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerName="registry-server" containerID="cri-o://f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a" gracePeriod=2 Mar 16 00:22:36 crc kubenswrapper[4816]: I0316 00:22:36.169448 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-284nb"] Mar 16 00:22:36 crc kubenswrapper[4816]: I0316 00:22:36.169684 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-284nb" podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="registry-server" containerID="cri-o://408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861" gracePeriod=2 Mar 16 00:22:36 crc kubenswrapper[4816]: I0316 00:22:36.436280 4816 generic.go:334] "Generic (PLEG): container finished" podID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerID="f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a" exitCode=0 Mar 16 00:22:36 crc kubenswrapper[4816]: I0316 00:22:36.436327 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrq7" 
event={"ID":"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2","Type":"ContainerDied","Data":"f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a"} Mar 16 00:22:37 crc kubenswrapper[4816]: I0316 00:22:37.446675 4816 generic.go:334] "Generic (PLEG): container finished" podID="449b2c21-4396-4d46-af73-e670b282f831" containerID="408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861" exitCode=0 Mar 16 00:22:37 crc kubenswrapper[4816]: I0316 00:22:37.446755 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284nb" event={"ID":"449b2c21-4396-4d46-af73-e670b282f831","Type":"ContainerDied","Data":"408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861"} Mar 16 00:22:38 crc kubenswrapper[4816]: E0316 00:22:38.792517 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea" Mar 16 00:22:38 crc kubenswrapper[4816]: E0316 00:22:38.793008 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk_openshift-operators(9a808114-3164-4abe-a481-1b5d3b9df2a0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:22:38 crc kubenswrapper[4816]: E0316 00:22:38.794566 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" podUID="9a808114-3164-4abe-a481-1b5d3b9df2a0" Mar 16 00:22:39 crc kubenswrapper[4816]: E0316 00:22:39.474561 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" podUID="9a808114-3164-4abe-a481-1b5d3b9df2a0" Mar 16 00:22:40 crc kubenswrapper[4816]: E0316 00:22:40.499696 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a is running failed: container process not found" containerID="f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:22:40 crc kubenswrapper[4816]: E0316 00:22:40.500696 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a is running failed: container process not found" containerID="f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:22:40 crc kubenswrapper[4816]: E0316 00:22:40.504950 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a is running failed: container process not found" containerID="f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:22:40 crc kubenswrapper[4816]: E0316 00:22:40.505056 4816 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-jjrq7" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerName="registry-server" Mar 16 00:22:41 crc kubenswrapper[4816]: E0316 00:22:41.726015 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861 is running failed: container process not found" containerID="408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:22:41 crc kubenswrapper[4816]: E0316 00:22:41.726452 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861 is running failed: container process not found" containerID="408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:22:41 crc kubenswrapper[4816]: E0316 00:22:41.726842 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861 is running failed: container process not found" containerID="408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:22:41 crc kubenswrapper[4816]: E0316 00:22:41.726888 4816 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-284nb" 
podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="registry-server" Mar 16 00:22:42 crc kubenswrapper[4816]: E0316 00:22:42.511107 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:2ecf763b02048d2cf4c17967a7b2cacc7afd6af0e963a39579d876f8f4170e3c" Mar 16 00:22:42 crc kubenswrapper[4816]: E0316 00:22:42.511351 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:2ecf763b02048d2cf4c17967a7b2cacc7afd6af0e963a39579d876f8f4170e3c,Command:[],Args:[--namespace=$(NAMESPACE) --images=perses=$(RELATED_IMAGE_PERSES) --images=alertmanager=$(RELATED_IMAGE_ALERTMANAGER) --images=prometheus=$(RELATED_IMAGE_PROMETHEUS) --images=thanos=$(RELATED_IMAGE_THANOS) --images=ui-dashboards=$(RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN) --images=ui-distributed-tracing=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN) --images=ui-distributed-tracing-pf5=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5) --images=ui-distributed-tracing-pf4=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4) --images=ui-logging=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN) --images=ui-logging-pf4=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4) --images=ui-troubleshooting-panel=$(RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN) --images=ui-monitoring=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN) --images=ui-monitoring-pf5=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5) --images=korrel8r=$(RELATED_IMAGE_KORREL8R) --images=health-analyzer=$(RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER) 
--openshift.enabled=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER,Value:registry.redhat.io/cluster-observability-operator/alertmanager-rhel9@sha256:dc62889b883f597de91b5389cc52c84c607247d49a807693be2f688e4703dfc3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS,Value:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:1b555e21bba7c609111ace4380382a696d9aceeb6e9816bf9023b8f689b6c741,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_THANOS,Value:registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:a223bab813b82d698992490bbb60927f6288a83ba52d539836c250e1471f6d34,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PERSES,Value:registry.redhat.io/cluster-observability-operator/perses-rhel9@sha256:e797cdb47beef40b04da7b6d645bca3dc32e6247003c45b56b38efd9e13bf01c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-rhel9@sha256:093d2731ac848ed5fd57356b155a19d3bf7b8db96d95b09c5d0095e143f7254f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-rhel9@sha256:7d662a120305e2528acc7e9142b770b5b6a7f4932ddfcadfa4ac953935124895,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf5-rhel9@sha256:75465aabb0aa427a5c531a8fcde463f6d119afbcc618ebcbf6b7ee9bc8aad160,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf4-rhel9@sha256:dc18c8d6a4a9a0a574a57cc5082c8a9b26023bd6d69b9732892d58
4c1dfe5070,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-rhel9@sha256:369729978cecdc13c99ef3d179f8eb8a450a4a0cb70b63c27a55a15d1710ba27,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-pf4-rhel9@sha256:d8c7a61d147f62b204d5c5f16864386025393453c9a81ea327bbd25d7765d611,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/troubleshooting-panel-console-plugin-rhel9@sha256:b4a6eb1cc118a4334b424614959d8b7f361ddd779b3a72690ca49b0a3f26d9b8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-rhel9@sha256:21d4fff670893ba4b7fbc528cd49f8b71c8281cede9ef84f0697065bb6a7fc50,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-pf5-rhel9@sha256:12d9dbe297a1c3b9df671f21156992082bc483887d851fafe76e5d17321ff474,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KORREL8R,Value:registry.redhat.io/cluster-observability-operator/korrel8r-rhel9@sha256:e65c37f04f6d76a0cbfe05edb3cddf6a8f14f859ee35cf3aebea8fcb991d2c19,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER,Value:registry.redhat.io/cluster-observability-operator/cluster-health-analyzer-rhel9@sha256:48e4e178c6eeaa9d5dd77a591c185a311b4b4a5caadb7199d48463123e31dc9e,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{400 -3} {} 400m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:observability-operator-tls,ReadOnly:true,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mz5bq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod observability-operator-59bdc8b94-w6wv7_openshift-operators(8d0f60fa-8d26-43ea-a680-1d3a92dd270d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:22:42 crc kubenswrapper[4816]: E0316 00:22:42.513185 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" podUID="8d0f60fa-8d26-43ea-a680-1d3a92dd270d" Mar 16 00:22:42 crc kubenswrapper[4816]: E0316 00:22:42.643625 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908" Mar 16 00:22:42 crc kubenswrapper[4816]: E0316 00:22:42.643803 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:pull,Image:registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908,Command:[/util/cpb /bundle],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bundle,ReadOnly:false,MountPath:/bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:util,ReadOnly:false,MountPath:/util,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4582c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l_openshift-marketplace(1da45fda-a8cc-46c1-8831-58418ecc9819): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 16 00:22:42 crc kubenswrapper[4816]: E0316 00:22:42.645212 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" podUID="1da45fda-a8cc-46c1-8831-58418ecc9819" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.084182 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.153940 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-catalog-content\") pod \"449b2c21-4396-4d46-af73-e670b282f831\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.154023 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-utilities\") pod \"449b2c21-4396-4d46-af73-e670b282f831\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.154073 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44s4s\" (UniqueName: \"kubernetes.io/projected/449b2c21-4396-4d46-af73-e670b282f831-kube-api-access-44s4s\") pod \"449b2c21-4396-4d46-af73-e670b282f831\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.155692 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-utilities" (OuterVolumeSpecName: "utilities") pod "449b2c21-4396-4d46-af73-e670b282f831" (UID: "449b2c21-4396-4d46-af73-e670b282f831"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.164199 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/449b2c21-4396-4d46-af73-e670b282f831-kube-api-access-44s4s" (OuterVolumeSpecName: "kube-api-access-44s4s") pod "449b2c21-4396-4d46-af73-e670b282f831" (UID: "449b2c21-4396-4d46-af73-e670b282f831"). InnerVolumeSpecName "kube-api-access-44s4s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.255474 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.255854 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44s4s\" (UniqueName: \"kubernetes.io/projected/449b2c21-4396-4d46-af73-e670b282f831-kube-api-access-44s4s\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.293795 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-kd6nx"] Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.294440 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:43 crc kubenswrapper[4816]: W0316 00:22:43.301715 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf26ba6ee_c940_434d_80fe_81c813576ac9.slice/crio-466dbd55112a9fcfa484c0e2ae0d33ac4f4113021db62a805d38c5c3536098c2 WatchSource:0}: Error finding container 466dbd55112a9fcfa484c0e2ae0d33ac4f4113021db62a805d38c5c3536098c2: Status 404 returned error can't find the container with id 466dbd55112a9fcfa484c0e2ae0d33ac4f4113021db62a805d38c5c3536098c2 Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.342528 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "449b2c21-4396-4d46-af73-e670b282f831" (UID: "449b2c21-4396-4d46-af73-e670b282f831"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.356653 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-catalog-content\") pod \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.356737 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62z8f\" (UniqueName: \"kubernetes.io/projected/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-kube-api-access-62z8f\") pod \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.356840 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-utilities\") pod \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.357084 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.358347 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-utilities" (OuterVolumeSpecName: "utilities") pod "d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" (UID: "d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.366636 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-kube-api-access-62z8f" (OuterVolumeSpecName: "kube-api-access-62z8f") pod "d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" (UID: "d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2"). InnerVolumeSpecName "kube-api-access-62z8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.419275 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-8494b79c9c-fmbm9"] Mar 16 00:22:43 crc kubenswrapper[4816]: W0316 00:22:43.428063 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c23dce2_a24c_4f57_9311_56675376c95e.slice/crio-307965d3063d52c3beb9c4eb556854acc9650d4c029c1214cf342be34f6b3bb0 WatchSource:0}: Error finding container 307965d3063d52c3beb9c4eb556854acc9650d4c029c1214cf342be34f6b3bb0: Status 404 returned error can't find the container with id 307965d3063d52c3beb9c4eb556854acc9650d4c029c1214cf342be34f6b3bb0 Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.437575 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" (UID: "d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.458174 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.458215 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62z8f\" (UniqueName: \"kubernetes.io/projected/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-kube-api-access-62z8f\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.458231 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.494605 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-kd6nx" event={"ID":"f26ba6ee-c940-434d-80fe-81c813576ac9","Type":"ContainerStarted","Data":"466dbd55112a9fcfa484c0e2ae0d33ac4f4113021db62a805d38c5c3536098c2"} Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.496743 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrq7" event={"ID":"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2","Type":"ContainerDied","Data":"5750fa2beb8413e4bbf66bc2e4ec103cf698ec3743191fcfdc001a528f09e12f"} Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.496782 4816 scope.go:117] "RemoveContainer" containerID="f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.496882 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.501078 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" event={"ID":"f24959c1-f57f-4bf6-8a55-c8a35173ff8b","Type":"ContainerStarted","Data":"230879dbf66fa1b269eddaa463bf2ce64a38c23de24acb9dfaabeaf7f4d8419a"} Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.501747 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.506965 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" event={"ID":"0c23dce2-a24c-4f57-9311-56675376c95e","Type":"ContainerStarted","Data":"307965d3063d52c3beb9c4eb556854acc9650d4c029c1214cf342be34f6b3bb0"} Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.527836 4816 scope.go:117] "RemoveContainer" containerID="79acb07939ccc5b80b6a155b3eb5d1de07ee6d3e3a71e7de4d918cb8ef32d1ed" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.528036 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"d963d56deb174bcc1b2f530e646e1a1dbd328868a82631422f67c019c313cf52"} Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.538955 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" podStartSLOduration=1.956176624 podStartE2EDuration="18.538936706s" podCreationTimestamp="2026-03-16 00:22:25 +0000 UTC" firstStartedPulling="2026-03-16 00:22:26.158507812 +0000 UTC m=+939.254807765" lastFinishedPulling="2026-03-16 00:22:42.741267904 +0000 UTC m=+955.837567847" observedRunningTime="2026-03-16 00:22:43.529318319 +0000 UTC m=+956.625618272" 
watchObservedRunningTime="2026-03-16 00:22:43.538936706 +0000 UTC m=+956.635236659" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.540676 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:43 crc kubenswrapper[4816]: E0316 00:22:43.553745 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:2ecf763b02048d2cf4c17967a7b2cacc7afd6af0e963a39579d876f8f4170e3c\\\"\"" pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" podUID="8d0f60fa-8d26-43ea-a680-1d3a92dd270d" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.562538 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284nb" event={"ID":"449b2c21-4396-4d46-af73-e670b282f831","Type":"ContainerDied","Data":"d18b65047ef76f8aa8382e87295979d6bc4c30a17f5c2e1a21242e76f4c64c95"} Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.562804 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" event={"ID":"36951342-3370-4291-baa3-2612f64036fd","Type":"ContainerStarted","Data":"56bfd05b1bd5295a86432769cfca977bfdcaf5320dded943aae99ff75ddd2b3d"} Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.562820 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44" event={"ID":"562f24fe-5c4c-4540-96ae-6e01f539141b","Type":"ContainerStarted","Data":"0b033ee0ae1fd900374e0904d62ee38cb53bab56f3b6256c9a5ba725203718cb"} Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.566891 4816 scope.go:117] "RemoveContainer" containerID="e5839a641f5e4b47edb9ba28f2918214558d3b4454dab06397a37f22bd1120b7" Mar 16 00:22:43 crc 
kubenswrapper[4816]: I0316 00:22:43.584589 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jjrq7"] Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.584645 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jjrq7"] Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.591051 4816 scope.go:117] "RemoveContainer" containerID="408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.620618 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44" podStartSLOduration=2.476263188 podStartE2EDuration="19.620582576s" podCreationTimestamp="2026-03-16 00:22:24 +0000 UTC" firstStartedPulling="2026-03-16 00:22:25.597119831 +0000 UTC m=+938.693419784" lastFinishedPulling="2026-03-16 00:22:42.741439219 +0000 UTC m=+955.837739172" observedRunningTime="2026-03-16 00:22:43.615325135 +0000 UTC m=+956.711625088" watchObservedRunningTime="2026-03-16 00:22:43.620582576 +0000 UTC m=+956.716882529" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.643730 4816 scope.go:117] "RemoveContainer" containerID="6675088a9bc77bc6e858928c42b983a7c8adb51e5f6f5dc372c160b6c32fce59" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.658952 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" podStartSLOduration=2.917420158 podStartE2EDuration="19.65893207s" podCreationTimestamp="2026-03-16 00:22:24 +0000 UTC" firstStartedPulling="2026-03-16 00:22:25.999128564 +0000 UTC m=+939.095428517" lastFinishedPulling="2026-03-16 00:22:42.740640476 +0000 UTC m=+955.836940429" observedRunningTime="2026-03-16 00:22:43.656001486 +0000 UTC m=+956.752301439" watchObservedRunningTime="2026-03-16 00:22:43.65893207 +0000 UTC m=+956.755232023" Mar 16 
00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.682785 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" path="/var/lib/kubelet/pods/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2/volumes" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.708615 4816 scope.go:117] "RemoveContainer" containerID="1891eef8092d3a2994e7d2d76188c04ec2abc90123f32af569269e31014fdb64" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.720016 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-284nb"] Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.741404 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-284nb"] Mar 16 00:22:45 crc kubenswrapper[4816]: I0316 00:22:45.676512 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="449b2c21-4396-4d46-af73-e670b282f831" path="/var/lib/kubelet/pods/449b2c21-4396-4d46-af73-e670b282f831/volumes" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.589972 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" event={"ID":"0c23dce2-a24c-4f57-9311-56675376c95e","Type":"ContainerStarted","Data":"09c44097f0ad4bab89366142ec9f2d996be7ca94ef2aea5cae839d9f0610e896"} Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.609369 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" podStartSLOduration=14.341006016 podStartE2EDuration="17.609343462s" podCreationTimestamp="2026-03-16 00:22:30 +0000 UTC" firstStartedPulling="2026-03-16 00:22:43.431866175 +0000 UTC m=+956.528166128" lastFinishedPulling="2026-03-16 00:22:46.700203621 +0000 UTC m=+959.796503574" observedRunningTime="2026-03-16 00:22:47.605779309 +0000 UTC m=+960.702079262" watchObservedRunningTime="2026-03-16 00:22:47.609343462 +0000 UTC m=+960.705643415" Mar 16 00:22:47 crc 
kubenswrapper[4816]: I0316 00:22:47.944766 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 16 00:22:47 crc kubenswrapper[4816]: E0316 00:22:47.945235 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="registry-server" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.945247 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="registry-server" Mar 16 00:22:47 crc kubenswrapper[4816]: E0316 00:22:47.945260 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="extract-content" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.945268 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="extract-content" Mar 16 00:22:47 crc kubenswrapper[4816]: E0316 00:22:47.945276 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="extract-utilities" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.945284 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="extract-utilities" Mar 16 00:22:47 crc kubenswrapper[4816]: E0316 00:22:47.945292 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerName="extract-content" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.945299 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerName="extract-content" Mar 16 00:22:47 crc kubenswrapper[4816]: E0316 00:22:47.945311 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerName="extract-utilities" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 
00:22:47.945317 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerName="extract-utilities" Mar 16 00:22:47 crc kubenswrapper[4816]: E0316 00:22:47.945333 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerName="registry-server" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.945340 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerName="registry-server" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.945447 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerName="registry-server" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.945457 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="registry-server" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.946339 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.951037 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.951412 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.951579 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.951755 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.952216 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-zjx5n" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.952468 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.952528 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.952673 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.962860 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.975917 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.073544 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.073611 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.073646 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.073673 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.073751 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-remote-certificate-authorities\") 
pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.073841 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.073869 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.073897 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.073931 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.074021 
4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.074076 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.074108 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/819af9fc-6db9-4743-bd06-f844f5ef5b0d-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.074136 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.074189 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: 
\"kubernetes.io/configmap/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.074210 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.176398 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.176474 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.176512 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.176575 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.176628 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.176868 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.176908 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.176955 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" 
Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.176983 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.177019 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.177057 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.177086 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.177118 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: 
\"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.177150 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/819af9fc-6db9-4743-bd06-f844f5ef5b0d-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.177179 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.177303 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.177678 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.177956 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.178056 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.178477 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.178661 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.178824 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc 
kubenswrapper[4816]: I0316 00:22:48.178999 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.182476 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.182967 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.183988 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/819af9fc-6db9-4743-bd06-f844f5ef5b0d-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.184117 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: 
\"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.185244 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.186763 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.190496 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.264417 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:52 crc kubenswrapper[4816]: I0316 00:22:52.075150 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 16 00:22:52 crc kubenswrapper[4816]: W0316 00:22:52.098728 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod819af9fc_6db9_4743_bd06_f844f5ef5b0d.slice/crio-4adaa61025ba1e8ab254fb2d53985c4f897bee3baaba37099e987fb4605316da WatchSource:0}: Error finding container 4adaa61025ba1e8ab254fb2d53985c4f897bee3baaba37099e987fb4605316da: Status 404 returned error can't find the container with id 4adaa61025ba1e8ab254fb2d53985c4f897bee3baaba37099e987fb4605316da Mar 16 00:22:52 crc kubenswrapper[4816]: I0316 00:22:52.617765 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"819af9fc-6db9-4743-bd06-f844f5ef5b0d","Type":"ContainerStarted","Data":"4adaa61025ba1e8ab254fb2d53985c4f897bee3baaba37099e987fb4605316da"} Mar 16 00:22:52 crc kubenswrapper[4816]: I0316 00:22:52.618905 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-kd6nx" event={"ID":"f26ba6ee-c940-434d-80fe-81c813576ac9","Type":"ContainerStarted","Data":"b9e03202df033fa3e436ece2c2d18d48351c2b0554812fb3c9f999fac9ec3ca2"} Mar 16 00:22:52 crc kubenswrapper[4816]: I0316 00:22:52.621301 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" event={"ID":"9a808114-3164-4abe-a481-1b5d3b9df2a0","Type":"ContainerStarted","Data":"d58cfae0108d1b4f237761f15cab98577ff90c022b58bb47cf4c218838194f3e"} Mar 16 00:22:52 crc kubenswrapper[4816]: I0316 00:22:52.670513 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" podStartSLOduration=-9223372008.184288 podStartE2EDuration="28.670487269s" podCreationTimestamp="2026-03-16 00:22:24 +0000 UTC" firstStartedPulling="2026-03-16 00:22:25.718897757 +0000 UTC m=+938.815197710" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:22:52.667150163 +0000 UTC m=+965.763450116" watchObservedRunningTime="2026-03-16 00:22:52.670487269 +0000 UTC m=+965.766787222" Mar 16 00:22:52 crc kubenswrapper[4816]: I0316 00:22:52.671566 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-kd6nx" podStartSLOduration=11.061688506 podStartE2EDuration="19.671542309s" podCreationTimestamp="2026-03-16 00:22:33 +0000 UTC" firstStartedPulling="2026-03-16 00:22:43.309492542 +0000 UTC m=+956.405792495" lastFinishedPulling="2026-03-16 00:22:51.919346345 +0000 UTC m=+965.015646298" observedRunningTime="2026-03-16 00:22:52.636691416 +0000 UTC m=+965.732991369" watchObservedRunningTime="2026-03-16 00:22:52.671542309 +0000 UTC m=+965.767842262" Mar 16 00:22:55 crc kubenswrapper[4816]: I0316 00:22:55.645467 4816 generic.go:334] "Generic (PLEG): container finished" podID="1da45fda-a8cc-46c1-8831-58418ecc9819" containerID="f0b6604af19fa4322ba98c463fa0ec289db6cb21f72ae73d9215646940dfdad1" exitCode=0 Mar 16 00:22:55 crc kubenswrapper[4816]: I0316 00:22:55.646515 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" event={"ID":"1da45fda-a8cc-46c1-8831-58418ecc9819","Type":"ContainerDied","Data":"f0b6604af19fa4322ba98c463fa0ec289db6cb21f72ae73d9215646940dfdad1"} Mar 16 00:22:55 crc kubenswrapper[4816]: I0316 00:22:55.749667 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:56 crc kubenswrapper[4816]: 
I0316 00:22:56.658016 4816 generic.go:334] "Generic (PLEG): container finished" podID="1da45fda-a8cc-46c1-8831-58418ecc9819" containerID="7595ee672d5f6277a3926773b74d4e4c2739ea44ae1b6448b9f0308af4417a33" exitCode=0 Mar 16 00:22:56 crc kubenswrapper[4816]: I0316 00:22:56.658061 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" event={"ID":"1da45fda-a8cc-46c1-8831-58418ecc9819","Type":"ContainerDied","Data":"7595ee672d5f6277a3926773b74d4e4c2739ea44ae1b6448b9f0308af4417a33"} Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.791522 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.892582 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-bundle\") pod \"1da45fda-a8cc-46c1-8831-58418ecc9819\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.892653 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-util\") pod \"1da45fda-a8cc-46c1-8831-58418ecc9819\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.892719 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4582c\" (UniqueName: \"kubernetes.io/projected/1da45fda-a8cc-46c1-8831-58418ecc9819-kube-api-access-4582c\") pod \"1da45fda-a8cc-46c1-8831-58418ecc9819\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.893430 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-bundle" (OuterVolumeSpecName: "bundle") pod "1da45fda-a8cc-46c1-8831-58418ecc9819" (UID: "1da45fda-a8cc-46c1-8831-58418ecc9819"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.898383 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1da45fda-a8cc-46c1-8831-58418ecc9819-kube-api-access-4582c" (OuterVolumeSpecName: "kube-api-access-4582c") pod "1da45fda-a8cc-46c1-8831-58418ecc9819" (UID: "1da45fda-a8cc-46c1-8831-58418ecc9819"). InnerVolumeSpecName "kube-api-access-4582c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.904833 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-util" (OuterVolumeSpecName: "util") pod "1da45fda-a8cc-46c1-8831-58418ecc9819" (UID: "1da45fda-a8cc-46c1-8831-58418ecc9819"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.993838 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-util\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.993879 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4582c\" (UniqueName: \"kubernetes.io/projected/1da45fda-a8cc-46c1-8831-58418ecc9819-kube-api-access-4582c\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.993895 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:03 crc kubenswrapper[4816]: I0316 00:23:03.704934 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" event={"ID":"1da45fda-a8cc-46c1-8831-58418ecc9819","Type":"ContainerDied","Data":"423ca09a07f2da9396c84b4219be8387d28d6dd64d1f4c92b01055a8dae546ea"} Mar 16 00:23:03 crc kubenswrapper[4816]: I0316 00:23:03.704977 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="423ca09a07f2da9396c84b4219be8387d28d6dd64d1f4c92b01055a8dae546ea" Mar 16 00:23:03 crc kubenswrapper[4816]: I0316 00:23:03.705104 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:23:03 crc kubenswrapper[4816]: I0316 00:23:03.727885 4816 scope.go:117] "RemoveContainer" containerID="185e1a33c845773d7893f16759f110b3a4a2b357c62cdafa5e5060cabc62a64e" Mar 16 00:23:04 crc kubenswrapper[4816]: I0316 00:23:04.712541 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" event={"ID":"8d0f60fa-8d26-43ea-a680-1d3a92dd270d","Type":"ContainerStarted","Data":"ebc2db9eb32f16fc38e87be7218d1a538aa38bbb30fda48350609a1429f10a8e"} Mar 16 00:23:04 crc kubenswrapper[4816]: I0316 00:23:04.713973 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"819af9fc-6db9-4743-bd06-f844f5ef5b0d","Type":"ContainerStarted","Data":"b584e1a8e4b0ac65a25b32d47ec6ced936ac543f56fdddb244f8dbc549daaeee"} Mar 16 00:23:04 crc kubenswrapper[4816]: I0316 00:23:04.714942 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:23:04 crc kubenswrapper[4816]: I0316 00:23:04.726546 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:23:04 crc kubenswrapper[4816]: I0316 00:23:04.783442 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" podStartSLOduration=1.779362475 podStartE2EDuration="39.783426578s" podCreationTimestamp="2026-03-16 00:22:25 +0000 UTC" firstStartedPulling="2026-03-16 00:22:25.967703389 +0000 UTC m=+939.064003342" lastFinishedPulling="2026-03-16 00:23:03.971767502 +0000 UTC m=+977.068067445" observedRunningTime="2026-03-16 00:23:04.780195375 +0000 UTC m=+977.876495328" watchObservedRunningTime="2026-03-16 00:23:04.783426578 +0000 UTC m=+977.879726531" Mar 
16 00:23:04 crc kubenswrapper[4816]: I0316 00:23:04.878928 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 16 00:23:04 crc kubenswrapper[4816]: I0316 00:23:04.905254 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 16 00:23:06 crc kubenswrapper[4816]: I0316 00:23:06.723807 4816 generic.go:334] "Generic (PLEG): container finished" podID="819af9fc-6db9-4743-bd06-f844f5ef5b0d" containerID="b584e1a8e4b0ac65a25b32d47ec6ced936ac543f56fdddb244f8dbc549daaeee" exitCode=0 Mar 16 00:23:06 crc kubenswrapper[4816]: I0316 00:23:06.725214 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"819af9fc-6db9-4743-bd06-f844f5ef5b0d","Type":"ContainerDied","Data":"b584e1a8e4b0ac65a25b32d47ec6ced936ac543f56fdddb244f8dbc549daaeee"} Mar 16 00:23:12 crc kubenswrapper[4816]: I0316 00:23:12.756769 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"819af9fc-6db9-4743-bd06-f844f5ef5b0d","Type":"ContainerStarted","Data":"6c31b45225359d15c615de3dc1429eddcd946a9788d4ec8d328f458ff6087e54"} Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.399221 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs"] Mar 16 00:23:13 crc kubenswrapper[4816]: E0316 00:23:13.399448 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da45fda-a8cc-46c1-8831-58418ecc9819" containerName="util" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.399459 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da45fda-a8cc-46c1-8831-58418ecc9819" containerName="util" Mar 16 00:23:13 crc kubenswrapper[4816]: E0316 00:23:13.399472 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da45fda-a8cc-46c1-8831-58418ecc9819" 
containerName="extract" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.399478 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da45fda-a8cc-46c1-8831-58418ecc9819" containerName="extract" Mar 16 00:23:13 crc kubenswrapper[4816]: E0316 00:23:13.399484 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da45fda-a8cc-46c1-8831-58418ecc9819" containerName="pull" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.399490 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da45fda-a8cc-46c1-8831-58418ecc9819" containerName="pull" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.399590 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da45fda-a8cc-46c1-8831-58418ecc9819" containerName="extract" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.400013 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.403647 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.404175 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.404326 4816 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-qnvfr" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.437965 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs"] Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.527780 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmxlz\" (UniqueName: 
\"kubernetes.io/projected/eb3fdaff-975a-4df2-a9f2-67b63b708615-kube-api-access-rmxlz\") pod \"cert-manager-operator-controller-manager-5586865c96-74xcs\" (UID: \"eb3fdaff-975a-4df2-a9f2-67b63b708615\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.527854 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb3fdaff-975a-4df2-a9f2-67b63b708615-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-74xcs\" (UID: \"eb3fdaff-975a-4df2-a9f2-67b63b708615\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.629031 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmxlz\" (UniqueName: \"kubernetes.io/projected/eb3fdaff-975a-4df2-a9f2-67b63b708615-kube-api-access-rmxlz\") pod \"cert-manager-operator-controller-manager-5586865c96-74xcs\" (UID: \"eb3fdaff-975a-4df2-a9f2-67b63b708615\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.629096 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb3fdaff-975a-4df2-a9f2-67b63b708615-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-74xcs\" (UID: \"eb3fdaff-975a-4df2-a9f2-67b63b708615\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.629761 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb3fdaff-975a-4df2-a9f2-67b63b708615-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-74xcs\" (UID: \"eb3fdaff-975a-4df2-a9f2-67b63b708615\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.649739 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmxlz\" (UniqueName: \"kubernetes.io/projected/eb3fdaff-975a-4df2-a9f2-67b63b708615-kube-api-access-rmxlz\") pod \"cert-manager-operator-controller-manager-5586865c96-74xcs\" (UID: \"eb3fdaff-975a-4df2-a9f2-67b63b708615\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.764229 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.768933 4816 generic.go:334] "Generic (PLEG): container finished" podID="819af9fc-6db9-4743-bd06-f844f5ef5b0d" containerID="6c31b45225359d15c615de3dc1429eddcd946a9788d4ec8d328f458ff6087e54" exitCode=0 Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.768990 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"819af9fc-6db9-4743-bd06-f844f5ef5b0d","Type":"ContainerDied","Data":"6c31b45225359d15c615de3dc1429eddcd946a9788d4ec8d328f458ff6087e54"} Mar 16 00:23:14 crc kubenswrapper[4816]: I0316 00:23:14.298488 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs"] Mar 16 00:23:14 crc kubenswrapper[4816]: I0316 00:23:14.775590 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" event={"ID":"eb3fdaff-975a-4df2-a9f2-67b63b708615","Type":"ContainerStarted","Data":"f27a01e50df74c0eaacba0e1f44ee68a7de94b8d48de71b66a41e23589e4f2a6"} Mar 16 00:23:14 crc kubenswrapper[4816]: I0316 00:23:14.777758 4816 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"819af9fc-6db9-4743-bd06-f844f5ef5b0d","Type":"ContainerStarted","Data":"bf1222611fb6e91e46e59464f518afd20a81523a36e5eac0ce8cf4090ae19ce7"}
Mar 16 00:23:14 crc kubenswrapper[4816]: I0316 00:23:14.777929 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:23:14 crc kubenswrapper[4816]: I0316 00:23:14.809315 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=15.807364772 podStartE2EDuration="27.809294535s" podCreationTimestamp="2026-03-16 00:22:47 +0000 UTC" firstStartedPulling="2026-03-16 00:22:52.100172581 +0000 UTC m=+965.196472534" lastFinishedPulling="2026-03-16 00:23:04.102102334 +0000 UTC m=+977.198402297" observedRunningTime="2026-03-16 00:23:14.80462967 +0000 UTC m=+987.900929633" watchObservedRunningTime="2026-03-16 00:23:14.809294535 +0000 UTC m=+987.905594488"
Mar 16 00:23:17 crc kubenswrapper[4816]: I0316 00:23:17.795075 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" event={"ID":"eb3fdaff-975a-4df2-a9f2-67b63b708615","Type":"ContainerStarted","Data":"87d86e56cfeee83f609cc9971120bf75c47e31110d4c0a159b405590a73e8b2f"}
Mar 16 00:23:17 crc kubenswrapper[4816]: I0316 00:23:17.822696 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" podStartSLOduration=1.679253431 podStartE2EDuration="4.822681462s" podCreationTimestamp="2026-03-16 00:23:13 +0000 UTC" firstStartedPulling="2026-03-16 00:23:14.301862057 +0000 UTC m=+987.398162010" lastFinishedPulling="2026-03-16 00:23:17.445290078 +0000 UTC m=+990.541590041" observedRunningTime="2026-03-16 00:23:17.817655357 +0000 UTC m=+990.913955310" watchObservedRunningTime="2026-03-16 00:23:17.822681462 +0000 UTC m=+990.918981415"
Mar 16 00:23:20 crc kubenswrapper[4816]: I0316 00:23:20.945682 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-ssr4q"]
Mar 16 00:23:20 crc kubenswrapper[4816]: I0316 00:23:20.947066 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q"
Mar 16 00:23:20 crc kubenswrapper[4816]: I0316 00:23:20.949156 4816 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-m24cv"
Mar 16 00:23:20 crc kubenswrapper[4816]: I0316 00:23:20.959934 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 16 00:23:20 crc kubenswrapper[4816]: I0316 00:23:20.960681 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 16 00:23:20 crc kubenswrapper[4816]: I0316 00:23:20.963982 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-ssr4q"]
Mar 16 00:23:21 crc kubenswrapper[4816]: I0316 00:23:21.028069 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca67da37-05ff-4b13-aeea-04ac7f17ffc0-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-ssr4q\" (UID: \"ca67da37-05ff-4b13-aeea-04ac7f17ffc0\") " pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q"
Mar 16 00:23:21 crc kubenswrapper[4816]: I0316 00:23:21.028179 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnn6l\" (UniqueName: \"kubernetes.io/projected/ca67da37-05ff-4b13-aeea-04ac7f17ffc0-kube-api-access-wnn6l\") pod \"cert-manager-webhook-6888856db4-ssr4q\" (UID: \"ca67da37-05ff-4b13-aeea-04ac7f17ffc0\") " pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q"
Mar 16 00:23:21 crc kubenswrapper[4816]: I0316 00:23:21.129938 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnn6l\" (UniqueName: \"kubernetes.io/projected/ca67da37-05ff-4b13-aeea-04ac7f17ffc0-kube-api-access-wnn6l\") pod \"cert-manager-webhook-6888856db4-ssr4q\" (UID: \"ca67da37-05ff-4b13-aeea-04ac7f17ffc0\") " pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q"
Mar 16 00:23:21 crc kubenswrapper[4816]: I0316 00:23:21.130286 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca67da37-05ff-4b13-aeea-04ac7f17ffc0-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-ssr4q\" (UID: \"ca67da37-05ff-4b13-aeea-04ac7f17ffc0\") " pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q"
Mar 16 00:23:21 crc kubenswrapper[4816]: I0316 00:23:21.151495 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnn6l\" (UniqueName: \"kubernetes.io/projected/ca67da37-05ff-4b13-aeea-04ac7f17ffc0-kube-api-access-wnn6l\") pod \"cert-manager-webhook-6888856db4-ssr4q\" (UID: \"ca67da37-05ff-4b13-aeea-04ac7f17ffc0\") " pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q"
Mar 16 00:23:21 crc kubenswrapper[4816]: I0316 00:23:21.154359 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca67da37-05ff-4b13-aeea-04ac7f17ffc0-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-ssr4q\" (UID: \"ca67da37-05ff-4b13-aeea-04ac7f17ffc0\") " pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q"
Mar 16 00:23:21 crc kubenswrapper[4816]: I0316 00:23:21.264163 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q"
Mar 16 00:23:21 crc kubenswrapper[4816]: I0316 00:23:21.796690 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-ssr4q"]
Mar 16 00:23:21 crc kubenswrapper[4816]: I0316 00:23:21.819893 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q" event={"ID":"ca67da37-05ff-4b13-aeea-04ac7f17ffc0","Type":"ContainerStarted","Data":"41b69df87f03003db592c1f15b5da63cc8346c7d7995b899a24b7152a94405f1"}
Mar 16 00:23:23 crc kubenswrapper[4816]: I0316 00:23:23.363859 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="819af9fc-6db9-4743-bd06-f844f5ef5b0d" containerName="elasticsearch" probeResult="failure" output=<
Mar 16 00:23:23 crc kubenswrapper[4816]: {"timestamp": "2026-03-16T00:23:23+00:00", "message": "readiness probe failed", "curl_rc": "7"}
Mar 16 00:23:23 crc kubenswrapper[4816]: >
Mar 16 00:23:24 crc kubenswrapper[4816]: I0316 00:23:24.747256 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-25jvg"]
Mar 16 00:23:24 crc kubenswrapper[4816]: I0316 00:23:24.748861 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg"
Mar 16 00:23:24 crc kubenswrapper[4816]: I0316 00:23:24.751866 4816 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-2z8m6"
Mar 16 00:23:24 crc kubenswrapper[4816]: I0316 00:23:24.776737 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-25jvg"]
Mar 16 00:23:24 crc kubenswrapper[4816]: I0316 00:23:24.879952 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe81d263-aafd-4bdb-a088-d4bc52592a2d-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-25jvg\" (UID: \"fe81d263-aafd-4bdb-a088-d4bc52592a2d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg"
Mar 16 00:23:24 crc kubenswrapper[4816]: I0316 00:23:24.879999 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gtkg\" (UniqueName: \"kubernetes.io/projected/fe81d263-aafd-4bdb-a088-d4bc52592a2d-kube-api-access-6gtkg\") pod \"cert-manager-cainjector-5545bd876-25jvg\" (UID: \"fe81d263-aafd-4bdb-a088-d4bc52592a2d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg"
Mar 16 00:23:24 crc kubenswrapper[4816]: I0316 00:23:24.981281 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe81d263-aafd-4bdb-a088-d4bc52592a2d-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-25jvg\" (UID: \"fe81d263-aafd-4bdb-a088-d4bc52592a2d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg"
Mar 16 00:23:24 crc kubenswrapper[4816]: I0316 00:23:24.981700 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gtkg\" (UniqueName: \"kubernetes.io/projected/fe81d263-aafd-4bdb-a088-d4bc52592a2d-kube-api-access-6gtkg\") pod \"cert-manager-cainjector-5545bd876-25jvg\" (UID: \"fe81d263-aafd-4bdb-a088-d4bc52592a2d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.007966 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe81d263-aafd-4bdb-a088-d4bc52592a2d-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-25jvg\" (UID: \"fe81d263-aafd-4bdb-a088-d4bc52592a2d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.008086 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gtkg\" (UniqueName: \"kubernetes.io/projected/fe81d263-aafd-4bdb-a088-d4bc52592a2d-kube-api-access-6gtkg\") pod \"cert-manager-cainjector-5545bd876-25jvg\" (UID: \"fe81d263-aafd-4bdb-a088-d4bc52592a2d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.074851 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.285532 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"]
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.289958 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.294869 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"]
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.295596 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.295695 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.295770 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.295846 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.388617 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.388660 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.388881 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.388959 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.389043 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.389085 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.389126 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.389158 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wphcb\" (UniqueName: \"kubernetes.io/projected/1fe11315-8a31-4f80-b084-fdb8542e0074-kube-api-access-wphcb\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.389253 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.389288 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.389376 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.389458 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491510 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491600 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491629 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491666 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491690 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491734 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491763 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491799 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491823 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wphcb\" (UniqueName: \"kubernetes.io/projected/1fe11315-8a31-4f80-b084-fdb8542e0074-kube-api-access-wphcb\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491860 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491900 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491931 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.492113 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.492283 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.492346 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.492532 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.492789 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.492821 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.492952 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.493037 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.493166 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.498331 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.502340 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.510589 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wphcb\" (UniqueName: \"kubernetes.io/projected/1fe11315-8a31-4f80-b084-fdb8542e0074-kube-api-access-wphcb\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.607133 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build"
Mar 16 00:23:26 crc kubenswrapper[4816]: I0316 00:23:26.868349 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q" event={"ID":"ca67da37-05ff-4b13-aeea-04ac7f17ffc0","Type":"ContainerStarted","Data":"c208eaca8aeab9d2179d24b48a1ba299a7908a41d5c5a9debd4fcf20cd66187c"}
Mar 16 00:23:26 crc kubenswrapper[4816]: I0316 00:23:26.868898 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q"
Mar 16 00:23:26 crc kubenswrapper[4816]: I0316 00:23:26.895339 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q" podStartSLOduration=2.038849025 podStartE2EDuration="6.894646139s" podCreationTimestamp="2026-03-16 00:23:20 +0000 UTC" firstStartedPulling="2026-03-16 00:23:21.813393814 +0000 UTC m=+994.909693767" lastFinishedPulling="2026-03-16 00:23:26.669190928 +0000 UTC m=+999.765490881" observedRunningTime="2026-03-16 00:23:26.889726217 +0000 UTC m=+999.986026160" watchObservedRunningTime="2026-03-16 00:23:26.894646139 +0000 UTC m=+999.990946092"
Mar 16 00:23:27 crc kubenswrapper[4816]: I0316 00:23:27.149097 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-25jvg"]
Mar 16 00:23:27 crc kubenswrapper[4816]: W0316 00:23:27.154458 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe81d263_aafd_4bdb_a088_d4bc52592a2d.slice/crio-cea0786ba796e78188331f295762f145ba8b98d9895fd4c229d90ebecb590447 WatchSource:0}: Error finding container cea0786ba796e78188331f295762f145ba8b98d9895fd4c229d90ebecb590447: Status 404 returned error can't find the container with id cea0786ba796e78188331f295762f145ba8b98d9895fd4c229d90ebecb590447
Mar 16 00:23:27 crc kubenswrapper[4816]: I0316 00:23:27.155204 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"]
Mar 16 00:23:27 crc kubenswrapper[4816]: I0316 00:23:27.875069 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg" event={"ID":"fe81d263-aafd-4bdb-a088-d4bc52592a2d","Type":"ContainerStarted","Data":"2dc9dc42025bc940a969c3b552ac8216b58ee0162daebc16d46303d8726b91bd"}
Mar 16 00:23:27 crc kubenswrapper[4816]: I0316 00:23:27.875433 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg" event={"ID":"fe81d263-aafd-4bdb-a088-d4bc52592a2d","Type":"ContainerStarted","Data":"cea0786ba796e78188331f295762f145ba8b98d9895fd4c229d90ebecb590447"}
Mar 16 00:23:27 crc kubenswrapper[4816]: I0316 00:23:27.876687 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"1fe11315-8a31-4f80-b084-fdb8542e0074","Type":"ContainerStarted","Data":"be6f67c8612bf615c5637b63abc0ae83d660784155c8869bc8e23fccdb9f8c21"}
Mar 16 00:23:27 crc kubenswrapper[4816]: I0316 00:23:27.893603 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg" podStartSLOduration=3.893579635 podStartE2EDuration="3.893579635s" podCreationTimestamp="2026-03-16 00:23:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:23:27.893558375 +0000 UTC m=+1000.989858328" watchObservedRunningTime="2026-03-16 00:23:27.893579635 +0000 UTC m=+1000.989879598"
Mar 16 00:23:28 crc kubenswrapper[4816]: I0316 00:23:28.963260 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:23:31 crc kubenswrapper[4816]: I0316 00:23:31.269499 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q"
Mar 16 00:23:31 crc kubenswrapper[4816]: I0316 00:23:31.809912 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-9q9nz"]
Mar 16 00:23:31 crc kubenswrapper[4816]: I0316 00:23:31.810758 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-9q9nz"
Mar 16 00:23:31 crc kubenswrapper[4816]: I0316 00:23:31.819396 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-9q9nz"]
Mar 16 00:23:31 crc kubenswrapper[4816]: I0316 00:23:31.823358 4816 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-22czx"
Mar 16 00:23:31 crc kubenswrapper[4816]: I0316 00:23:31.885230 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkpxq\" (UniqueName: \"kubernetes.io/projected/88d51e1b-a795-4157-82b4-8a74d228e698-kube-api-access-tkpxq\") pod \"cert-manager-545d4d4674-9q9nz\" (UID: \"88d51e1b-a795-4157-82b4-8a74d228e698\") " pod="cert-manager/cert-manager-545d4d4674-9q9nz"
Mar 16 00:23:31 crc kubenswrapper[4816]: I0316 00:23:31.885529 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88d51e1b-a795-4157-82b4-8a74d228e698-bound-sa-token\") pod \"cert-manager-545d4d4674-9q9nz\" (UID: \"88d51e1b-a795-4157-82b4-8a74d228e698\") " pod="cert-manager/cert-manager-545d4d4674-9q9nz"
Mar 16 00:23:31 crc kubenswrapper[4816]: I0316 00:23:31.987103 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkpxq\" (UniqueName: \"kubernetes.io/projected/88d51e1b-a795-4157-82b4-8a74d228e698-kube-api-access-tkpxq\") pod \"cert-manager-545d4d4674-9q9nz\" (UID: \"88d51e1b-a795-4157-82b4-8a74d228e698\") " pod="cert-manager/cert-manager-545d4d4674-9q9nz"
Mar 16 00:23:31 crc kubenswrapper[4816]: I0316 00:23:31.987171 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88d51e1b-a795-4157-82b4-8a74d228e698-bound-sa-token\") pod \"cert-manager-545d4d4674-9q9nz\" (UID: \"88d51e1b-a795-4157-82b4-8a74d228e698\") " pod="cert-manager/cert-manager-545d4d4674-9q9nz"
Mar 16 00:23:32 crc kubenswrapper[4816]: I0316 00:23:32.008392 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88d51e1b-a795-4157-82b4-8a74d228e698-bound-sa-token\") pod \"cert-manager-545d4d4674-9q9nz\" (UID: \"88d51e1b-a795-4157-82b4-8a74d228e698\") " pod="cert-manager/cert-manager-545d4d4674-9q9nz"
Mar 16 00:23:32 crc kubenswrapper[4816]: I0316 00:23:32.008994 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkpxq\" (UniqueName: \"kubernetes.io/projected/88d51e1b-a795-4157-82b4-8a74d228e698-kube-api-access-tkpxq\") pod \"cert-manager-545d4d4674-9q9nz\" (UID: \"88d51e1b-a795-4157-82b4-8a74d228e698\") " pod="cert-manager/cert-manager-545d4d4674-9q9nz"
Mar 16 00:23:32 crc kubenswrapper[4816]: I0316 00:23:32.132495 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-9q9nz"
Mar 16 00:23:34 crc kubenswrapper[4816]: I0316 00:23:34.079925 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-9q9nz"]
Mar 16 00:23:34 crc kubenswrapper[4816]: I0316 00:23:34.930781 4816 generic.go:334] "Generic (PLEG): container finished" podID="1fe11315-8a31-4f80-b084-fdb8542e0074" containerID="260d19a4e09f99c50d12150236c719bb65b9e9ace774386f70e495d800792e5c" exitCode=0
Mar 16 00:23:34 crc kubenswrapper[4816]: I0316 00:23:34.930865 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"1fe11315-8a31-4f80-b084-fdb8542e0074","Type":"ContainerDied","Data":"260d19a4e09f99c50d12150236c719bb65b9e9ace774386f70e495d800792e5c"}
Mar 16 00:23:34 crc kubenswrapper[4816]: I0316 00:23:34.934892 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-9q9nz" event={"ID":"88d51e1b-a795-4157-82b4-8a74d228e698","Type":"ContainerStarted","Data":"9d677ad5f20392a8d6ce9b563ba14b7ad91b7a163a96a096480c28c9940d205d"}
Mar 16 00:23:34 crc kubenswrapper[4816]: I0316 00:23:34.935500 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-9q9nz" event={"ID":"88d51e1b-a795-4157-82b4-8a74d228e698","Type":"ContainerStarted","Data":"2da63d4a0826b4121db74dca7d4420ff200b18389893711ba7758a5be940edb4"}
Mar 16 00:23:34 crc kubenswrapper[4816]: I0316 00:23:34.994459 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-9q9nz" podStartSLOduration=3.99443789 podStartE2EDuration="3.99443789s" podCreationTimestamp="2026-03-16 00:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:23:34.990509906 +0000 UTC m=+1008.086809859" watchObservedRunningTime="2026-03-16 00:23:34.99443789 +0000 UTC m=+1008.090737863"
Mar 16 00:23:35 crc kubenswrapper[4816]: I0316 00:23:35.351640 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"]
Mar 16 00:23:35 crc kubenswrapper[4816]: I0316 00:23:35.943846 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"1fe11315-8a31-4f80-b084-fdb8542e0074","Type":"ContainerStarted","Data":"5ab87e5b52ff667c30c02fb846b96580f3a8bdb5023de19607d2656e42f7c8c7"}
Mar 16 00:23:35 crc kubenswrapper[4816]: I0316 00:23:35.966034 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-1-build" podStartSLOduration=4.424430365 podStartE2EDuration="10.966017769s" podCreationTimestamp="2026-03-16 00:23:25 +0000 UTC" firstStartedPulling="2026-03-16 00:23:27.166835284 +0000 UTC m=+1000.263135237" lastFinishedPulling="2026-03-16 00:23:33.708422688 +0000 UTC m=+1006.804722641" observedRunningTime="2026-03-16 00:23:35.964492225 +0000 UTC m=+1009.060792178" watchObservedRunningTime="2026-03-16 00:23:35.966017769 +0000 UTC m=+1009.062317722"
Mar 16 00:23:36 crc kubenswrapper[4816]: I0316 00:23:36.956920 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="1fe11315-8a31-4f80-b084-fdb8542e0074" containerName="docker-build" containerID="cri-o://5ab87e5b52ff667c30c02fb846b96580f3a8bdb5023de19607d2656e42f7c8c7" gracePeriod=30
Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.011446 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"]
Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.015097 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build"
Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.018018 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config"
Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.020165 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca"
Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.020540 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca"
Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.038872 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"]
Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.158978 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7zgw\" (UniqueName: \"kubernetes.io/projected/a4629507-876a-405c-891c-5dcd521cf590-kube-api-access-h7zgw\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159041 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159070 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159094 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159123 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159148 4816
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159173 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159203 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159267 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159348 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-buildworkdir\") pod 
\"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159386 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159506 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.261055 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.261403 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.261531 4816 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.261707 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.261799 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262014 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262199 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7zgw\" (UniqueName: \"kubernetes.io/projected/a4629507-876a-405c-891c-5dcd521cf590-kube-api-access-h7zgw\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262321 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262044 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262092 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262642 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262715 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 
00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262762 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262814 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262854 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262938 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262969 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: 
\"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.263137 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.263396 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.263624 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.264338 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.269088 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: 
\"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.270091 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.276877 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7zgw\" (UniqueName: \"kubernetes.io/projected/a4629507-876a-405c-891c-5dcd521cf590-kube-api-access-h7zgw\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.332359 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4816]: I0316 00:23:38.319975 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 16 00:23:38 crc kubenswrapper[4816]: I0316 00:23:38.970924 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"a4629507-876a-405c-891c-5dcd521cf590","Type":"ContainerStarted","Data":"afc880cd89cb3d0dd5c36035edaf726279cc4d27b43fc12a6df286ecc563c314"} Mar 16 00:23:39 crc kubenswrapper[4816]: I0316 00:23:39.984119 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"a4629507-876a-405c-891c-5dcd521cf590","Type":"ContainerStarted","Data":"863e0a61dd1fbf63d5e851bf29401be711920c970a57fdf0c47e4215e8849370"} Mar 16 00:23:39 crc kubenswrapper[4816]: I0316 00:23:39.991992 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_1fe11315-8a31-4f80-b084-fdb8542e0074/docker-build/0.log" Mar 16 00:23:39 crc kubenswrapper[4816]: I0316 00:23:39.997115 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"1fe11315-8a31-4f80-b084-fdb8542e0074","Type":"ContainerDied","Data":"5ab87e5b52ff667c30c02fb846b96580f3a8bdb5023de19607d2656e42f7c8c7"} Mar 16 00:23:39 crc kubenswrapper[4816]: I0316 00:23:39.997333 4816 generic.go:334] "Generic (PLEG): container finished" podID="1fe11315-8a31-4f80-b084-fdb8542e0074" containerID="5ab87e5b52ff667c30c02fb846b96580f3a8bdb5023de19607d2656e42f7c8c7" exitCode=1 Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.082352 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_1fe11315-8a31-4f80-b084-fdb8542e0074/docker-build/0.log" Mar 16 00:23:40 crc 
kubenswrapper[4816]: I0316 00:23:40.082643 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205165 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-node-pullsecrets\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205268 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-buildworkdir\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205326 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-run\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205362 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-ca-bundles\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205404 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wphcb\" (UniqueName: \"kubernetes.io/projected/1fe11315-8a31-4f80-b084-fdb8542e0074-kube-api-access-wphcb\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 
00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205451 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-root\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205524 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-proxy-ca-bundles\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205610 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-pull\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205663 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-buildcachedir\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205725 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-push\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205765 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-build-blob-cache\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205802 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-system-configs\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205897 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.206152 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.206385 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.206405 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.206937 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.207023 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.207206 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.207209 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.207758 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.208139 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.208260 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.208341 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.208425 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.208500 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.208691 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.208798 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.208902 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.208997 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.217258 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.220799 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe11315-8a31-4f80-b084-fdb8542e0074-kube-api-access-wphcb" (OuterVolumeSpecName: "kube-api-access-wphcb") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "kube-api-access-wphcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.222885 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.310324 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wphcb\" (UniqueName: \"kubernetes.io/projected/1fe11315-8a31-4f80-b084-fdb8542e0074-kube-api-access-wphcb\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.310367 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.310380 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:41 crc kubenswrapper[4816]: I0316 00:23:41.005323 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_1fe11315-8a31-4f80-b084-fdb8542e0074/docker-build/0.log" Mar 16 00:23:41 crc kubenswrapper[4816]: I0316 00:23:41.006424 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:41 crc kubenswrapper[4816]: I0316 00:23:41.006544 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"1fe11315-8a31-4f80-b084-fdb8542e0074","Type":"ContainerDied","Data":"be6f67c8612bf615c5637b63abc0ae83d660784155c8869bc8e23fccdb9f8c21"} Mar 16 00:23:41 crc kubenswrapper[4816]: I0316 00:23:41.006698 4816 scope.go:117] "RemoveContainer" containerID="5ab87e5b52ff667c30c02fb846b96580f3a8bdb5023de19607d2656e42f7c8c7" Mar 16 00:23:41 crc kubenswrapper[4816]: I0316 00:23:41.029594 4816 scope.go:117] "RemoveContainer" containerID="260d19a4e09f99c50d12150236c719bb65b9e9ace774386f70e495d800792e5c" Mar 16 00:23:41 crc kubenswrapper[4816]: I0316 00:23:41.032956 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 16 00:23:41 crc kubenswrapper[4816]: I0316 00:23:41.043051 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 16 00:23:41 crc kubenswrapper[4816]: I0316 00:23:41.674707 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fe11315-8a31-4f80-b084-fdb8542e0074" path="/var/lib/kubelet/pods/1fe11315-8a31-4f80-b084-fdb8542e0074/volumes" Mar 16 00:23:48 crc kubenswrapper[4816]: I0316 00:23:48.049173 4816 generic.go:334] "Generic (PLEG): container finished" podID="a4629507-876a-405c-891c-5dcd521cf590" containerID="863e0a61dd1fbf63d5e851bf29401be711920c970a57fdf0c47e4215e8849370" exitCode=0 Mar 16 00:23:48 crc kubenswrapper[4816]: I0316 00:23:48.049291 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"a4629507-876a-405c-891c-5dcd521cf590","Type":"ContainerDied","Data":"863e0a61dd1fbf63d5e851bf29401be711920c970a57fdf0c47e4215e8849370"} Mar 16 00:23:49 crc kubenswrapper[4816]: I0316 00:23:49.059065 4816 generic.go:334] "Generic (PLEG): container finished" podID="a4629507-876a-405c-891c-5dcd521cf590" containerID="23e82f1c92387ad59df9c54ccbf20b2c5dd61bbb9ff88c126dddd725b46d94c0" exitCode=0 Mar 16 00:23:49 crc kubenswrapper[4816]: I0316 00:23:49.059122 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"a4629507-876a-405c-891c-5dcd521cf590","Type":"ContainerDied","Data":"23e82f1c92387ad59df9c54ccbf20b2c5dd61bbb9ff88c126dddd725b46d94c0"} Mar 16 00:23:49 crc kubenswrapper[4816]: I0316 00:23:49.099538 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_a4629507-876a-405c-891c-5dcd521cf590/manage-dockerfile/0.log" Mar 16 00:23:50 crc kubenswrapper[4816]: I0316 00:23:50.067054 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"a4629507-876a-405c-891c-5dcd521cf590","Type":"ContainerStarted","Data":"569da32a07b2521faaf5205b0d1082783f2685ea5a711b0eadcc5491dd41185a"} Mar 16 00:23:50 crc kubenswrapper[4816]: I0316 00:23:50.098575 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-2-build" podStartSLOduration=14.098513046 podStartE2EDuration="14.098513046s" podCreationTimestamp="2026-03-16 00:23:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:23:50.096288921 +0000 UTC m=+1023.192588884" watchObservedRunningTime="2026-03-16 00:23:50.098513046 +0000 UTC m=+1023.194813009" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 
00:24:00.137147 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560344-qmt9b"] Mar 16 00:24:00 crc kubenswrapper[4816]: E0316 00:24:00.139888 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe11315-8a31-4f80-b084-fdb8542e0074" containerName="manage-dockerfile" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.139905 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe11315-8a31-4f80-b084-fdb8542e0074" containerName="manage-dockerfile" Mar 16 00:24:00 crc kubenswrapper[4816]: E0316 00:24:00.139932 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe11315-8a31-4f80-b084-fdb8542e0074" containerName="docker-build" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.139940 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe11315-8a31-4f80-b084-fdb8542e0074" containerName="docker-build" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.140064 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe11315-8a31-4f80-b084-fdb8542e0074" containerName="docker-build" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.140546 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560344-qmt9b" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.143801 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560344-qmt9b"] Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.164644 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.164781 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.164954 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.266512 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksgrh\" (UniqueName: \"kubernetes.io/projected/add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d-kube-api-access-ksgrh\") pod \"auto-csr-approver-29560344-qmt9b\" (UID: \"add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d\") " pod="openshift-infra/auto-csr-approver-29560344-qmt9b" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.367839 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksgrh\" (UniqueName: \"kubernetes.io/projected/add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d-kube-api-access-ksgrh\") pod \"auto-csr-approver-29560344-qmt9b\" (UID: \"add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d\") " pod="openshift-infra/auto-csr-approver-29560344-qmt9b" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.386714 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksgrh\" (UniqueName: \"kubernetes.io/projected/add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d-kube-api-access-ksgrh\") pod \"auto-csr-approver-29560344-qmt9b\" (UID: \"add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d\") " 
pod="openshift-infra/auto-csr-approver-29560344-qmt9b" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.486687 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560344-qmt9b" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.726726 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560344-qmt9b"] Mar 16 00:24:01 crc kubenswrapper[4816]: I0316 00:24:01.133921 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560344-qmt9b" event={"ID":"add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d","Type":"ContainerStarted","Data":"31c29a72f860f347baf8d5002acc0aa9f6f3e1cd72c28219086ab49c38e3181a"} Mar 16 00:24:02 crc kubenswrapper[4816]: I0316 00:24:02.140510 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560344-qmt9b" event={"ID":"add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d","Type":"ContainerStarted","Data":"6050167de1d894cd0016711271e17ed54f0e6320bd8403d36883159d39c3c966"} Mar 16 00:24:02 crc kubenswrapper[4816]: I0316 00:24:02.154608 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29560344-qmt9b" podStartSLOduration=1.22612965 podStartE2EDuration="2.154591898s" podCreationTimestamp="2026-03-16 00:24:00 +0000 UTC" firstStartedPulling="2026-03-16 00:24:00.750791386 +0000 UTC m=+1033.847091339" lastFinishedPulling="2026-03-16 00:24:01.679253634 +0000 UTC m=+1034.775553587" observedRunningTime="2026-03-16 00:24:02.154267938 +0000 UTC m=+1035.250567901" watchObservedRunningTime="2026-03-16 00:24:02.154591898 +0000 UTC m=+1035.250891851" Mar 16 00:24:03 crc kubenswrapper[4816]: I0316 00:24:03.146549 4816 generic.go:334] "Generic (PLEG): container finished" podID="add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d" containerID="6050167de1d894cd0016711271e17ed54f0e6320bd8403d36883159d39c3c966" exitCode=0 Mar 16 00:24:03 crc 
kubenswrapper[4816]: I0316 00:24:03.146604 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560344-qmt9b" event={"ID":"add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d","Type":"ContainerDied","Data":"6050167de1d894cd0016711271e17ed54f0e6320bd8403d36883159d39c3c966"} Mar 16 00:24:04 crc kubenswrapper[4816]: I0316 00:24:04.382794 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560344-qmt9b" Mar 16 00:24:04 crc kubenswrapper[4816]: I0316 00:24:04.552145 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksgrh\" (UniqueName: \"kubernetes.io/projected/add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d-kube-api-access-ksgrh\") pod \"add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d\" (UID: \"add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d\") " Mar 16 00:24:04 crc kubenswrapper[4816]: I0316 00:24:04.557266 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d-kube-api-access-ksgrh" (OuterVolumeSpecName: "kube-api-access-ksgrh") pod "add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d" (UID: "add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d"). InnerVolumeSpecName "kube-api-access-ksgrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:24:04 crc kubenswrapper[4816]: I0316 00:24:04.653656 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksgrh\" (UniqueName: \"kubernetes.io/projected/add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d-kube-api-access-ksgrh\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:05 crc kubenswrapper[4816]: I0316 00:24:05.160461 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560344-qmt9b" event={"ID":"add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d","Type":"ContainerDied","Data":"31c29a72f860f347baf8d5002acc0aa9f6f3e1cd72c28219086ab49c38e3181a"} Mar 16 00:24:05 crc kubenswrapper[4816]: I0316 00:24:05.160497 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31c29a72f860f347baf8d5002acc0aa9f6f3e1cd72c28219086ab49c38e3181a" Mar 16 00:24:05 crc kubenswrapper[4816]: I0316 00:24:05.160542 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560344-qmt9b" Mar 16 00:24:05 crc kubenswrapper[4816]: I0316 00:24:05.215534 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560338-8bkf9"] Mar 16 00:24:05 crc kubenswrapper[4816]: I0316 00:24:05.233573 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560338-8bkf9"] Mar 16 00:24:05 crc kubenswrapper[4816]: I0316 00:24:05.675277 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cfda38e-dbdc-4b42-8a0d-964103ee01cd" path="/var/lib/kubelet/pods/6cfda38e-dbdc-4b42-8a0d-964103ee01cd/volumes" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.572157 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cql79"] Mar 16 00:24:41 crc kubenswrapper[4816]: E0316 00:24:41.572997 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d" containerName="oc" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.573013 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d" containerName="oc" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.573167 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d" containerName="oc" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.574315 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.582846 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cql79"] Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.662715 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-utilities\") pod \"community-operators-cql79\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.662773 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-catalog-content\") pod \"community-operators-cql79\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.662845 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl7m9\" (UniqueName: \"kubernetes.io/projected/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-kube-api-access-cl7m9\") pod \"community-operators-cql79\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " 
pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.764257 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-utilities\") pod \"community-operators-cql79\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.764305 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-catalog-content\") pod \"community-operators-cql79\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.764351 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl7m9\" (UniqueName: \"kubernetes.io/projected/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-kube-api-access-cl7m9\") pod \"community-operators-cql79\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.764824 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-utilities\") pod \"community-operators-cql79\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.764909 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-catalog-content\") pod \"community-operators-cql79\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " 
pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.783220 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl7m9\" (UniqueName: \"kubernetes.io/projected/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-kube-api-access-cl7m9\") pod \"community-operators-cql79\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.904295 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:42 crc kubenswrapper[4816]: I0316 00:24:42.402516 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cql79"] Mar 16 00:24:43 crc kubenswrapper[4816]: I0316 00:24:43.394461 4816 generic.go:334] "Generic (PLEG): container finished" podID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerID="9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d" exitCode=0 Mar 16 00:24:43 crc kubenswrapper[4816]: I0316 00:24:43.394531 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cql79" event={"ID":"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd","Type":"ContainerDied","Data":"9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d"} Mar 16 00:24:43 crc kubenswrapper[4816]: I0316 00:24:43.394872 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cql79" event={"ID":"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd","Type":"ContainerStarted","Data":"7206621fed299d699bb828068794b51424cc6881ba7958e6018750c2a55ad6a7"} Mar 16 00:24:45 crc kubenswrapper[4816]: I0316 00:24:45.408776 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cql79" 
event={"ID":"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd","Type":"ContainerStarted","Data":"90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff"} Mar 16 00:24:46 crc kubenswrapper[4816]: I0316 00:24:46.417998 4816 generic.go:334] "Generic (PLEG): container finished" podID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerID="90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff" exitCode=0 Mar 16 00:24:46 crc kubenswrapper[4816]: I0316 00:24:46.418069 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cql79" event={"ID":"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd","Type":"ContainerDied","Data":"90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff"} Mar 16 00:24:47 crc kubenswrapper[4816]: I0316 00:24:47.425882 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cql79" event={"ID":"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd","Type":"ContainerStarted","Data":"3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65"} Mar 16 00:24:47 crc kubenswrapper[4816]: I0316 00:24:47.446666 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cql79" podStartSLOduration=2.9410904540000002 podStartE2EDuration="6.446645611s" podCreationTimestamp="2026-03-16 00:24:41 +0000 UTC" firstStartedPulling="2026-03-16 00:24:43.39802039 +0000 UTC m=+1076.494320343" lastFinishedPulling="2026-03-16 00:24:46.903575547 +0000 UTC m=+1079.999875500" observedRunningTime="2026-03-16 00:24:47.442645932 +0000 UTC m=+1080.538945885" watchObservedRunningTime="2026-03-16 00:24:47.446645611 +0000 UTC m=+1080.542945574" Mar 16 00:24:51 crc kubenswrapper[4816]: I0316 00:24:51.904932 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:51 crc kubenswrapper[4816]: I0316 00:24:51.905686 4816 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:51 crc kubenswrapper[4816]: I0316 00:24:51.949241 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:52 crc kubenswrapper[4816]: I0316 00:24:52.499349 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:52 crc kubenswrapper[4816]: I0316 00:24:52.540191 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cql79"] Mar 16 00:24:54 crc kubenswrapper[4816]: I0316 00:24:54.471852 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cql79" podUID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerName="registry-server" containerID="cri-o://3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65" gracePeriod=2 Mar 16 00:24:54 crc kubenswrapper[4816]: I0316 00:24:54.847399 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.032537 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-catalog-content\") pod \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.032655 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-utilities\") pod \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.032753 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl7m9\" (UniqueName: \"kubernetes.io/projected/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-kube-api-access-cl7m9\") pod \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.033604 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-utilities" (OuterVolumeSpecName: "utilities") pod "a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" (UID: "a4cadddc-b411-48ca-b4d3-dc7fdf9767dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.037822 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-kube-api-access-cl7m9" (OuterVolumeSpecName: "kube-api-access-cl7m9") pod "a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" (UID: "a4cadddc-b411-48ca-b4d3-dc7fdf9767dd"). InnerVolumeSpecName "kube-api-access-cl7m9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.099054 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" (UID: "a4cadddc-b411-48ca-b4d3-dc7fdf9767dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.134620 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl7m9\" (UniqueName: \"kubernetes.io/projected/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-kube-api-access-cl7m9\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.134661 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.134674 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.479956 4816 generic.go:334] "Generic (PLEG): container finished" podID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerID="3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65" exitCode=0 Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.480029 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.480018 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cql79" event={"ID":"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd","Type":"ContainerDied","Data":"3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65"} Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.480176 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cql79" event={"ID":"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd","Type":"ContainerDied","Data":"7206621fed299d699bb828068794b51424cc6881ba7958e6018750c2a55ad6a7"} Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.480200 4816 scope.go:117] "RemoveContainer" containerID="3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.495132 4816 scope.go:117] "RemoveContainer" containerID="90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.508897 4816 scope.go:117] "RemoveContainer" containerID="9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.515273 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cql79"] Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.522671 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cql79"] Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.529712 4816 scope.go:117] "RemoveContainer" containerID="3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65" Mar 16 00:24:55 crc kubenswrapper[4816]: E0316 00:24:55.530247 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65\": container with ID starting with 3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65 not found: ID does not exist" containerID="3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.530279 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65"} err="failed to get container status \"3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65\": rpc error: code = NotFound desc = could not find container \"3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65\": container with ID starting with 3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65 not found: ID does not exist" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.530300 4816 scope.go:117] "RemoveContainer" containerID="90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff" Mar 16 00:24:55 crc kubenswrapper[4816]: E0316 00:24:55.530723 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff\": container with ID starting with 90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff not found: ID does not exist" containerID="90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.530768 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff"} err="failed to get container status \"90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff\": rpc error: code = NotFound desc = could not find container \"90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff\": container with ID 
starting with 90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff not found: ID does not exist" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.530804 4816 scope.go:117] "RemoveContainer" containerID="9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d" Mar 16 00:24:55 crc kubenswrapper[4816]: E0316 00:24:55.531098 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d\": container with ID starting with 9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d not found: ID does not exist" containerID="9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.531122 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d"} err="failed to get container status \"9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d\": rpc error: code = NotFound desc = could not find container \"9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d\": container with ID starting with 9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d not found: ID does not exist" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.673667 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" path="/var/lib/kubelet/pods/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd/volumes" Mar 16 00:25:01 crc kubenswrapper[4816]: I0316 00:25:01.863001 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:25:01 crc kubenswrapper[4816]: I0316 
00:25:01.863481 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:25:04 crc kubenswrapper[4816]: I0316 00:25:04.055773 4816 scope.go:117] "RemoveContainer" containerID="b862cec0bd3d63e5c9dfe4071f9f4f3cb758b083bc3f73a5460bc03b5c4debd8" Mar 16 00:25:07 crc kubenswrapper[4816]: I0316 00:25:07.569120 4816 generic.go:334] "Generic (PLEG): container finished" podID="a4629507-876a-405c-891c-5dcd521cf590" containerID="569da32a07b2521faaf5205b0d1082783f2685ea5a711b0eadcc5491dd41185a" exitCode=0 Mar 16 00:25:07 crc kubenswrapper[4816]: I0316 00:25:07.569254 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"a4629507-876a-405c-891c-5dcd521cf590","Type":"ContainerDied","Data":"569da32a07b2521faaf5205b0d1082783f2685ea5a711b0eadcc5491dd41185a"} Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.795418 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815241 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-system-configs\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815302 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-ca-bundles\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815361 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7zgw\" (UniqueName: \"kubernetes.io/projected/a4629507-876a-405c-891c-5dcd521cf590-kube-api-access-h7zgw\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815383 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-buildworkdir\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815401 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-build-blob-cache\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815427 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-push\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815446 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-run\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815472 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-node-pullsecrets\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815493 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-root\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815526 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-pull\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815545 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-buildcachedir\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815594 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-proxy-ca-bundles\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.816451 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.816496 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.816572 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.816790 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.817761 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.819877 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.822178 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4629507-876a-405c-891c-5dcd521cf590-kube-api-access-h7zgw" (OuterVolumeSpecName: "kube-api-access-h7zgw") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "kube-api-access-h7zgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.822439 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.834956 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.869524 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916400 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916437 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916446 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7zgw\" (UniqueName: \"kubernetes.io/projected/a4629507-876a-405c-891c-5dcd521cf590-kube-api-access-h7zgw\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916454 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916463 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916472 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916479 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-node-pullsecrets\") on node \"crc\" DevicePath 
\"\"" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916487 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916495 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916523 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:09 crc kubenswrapper[4816]: I0316 00:25:09.011117 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:25:09 crc kubenswrapper[4816]: I0316 00:25:09.017608 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:09 crc kubenswrapper[4816]: I0316 00:25:09.585510 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"a4629507-876a-405c-891c-5dcd521cf590","Type":"ContainerDied","Data":"afc880cd89cb3d0dd5c36035edaf726279cc4d27b43fc12a6df286ecc563c314"} Mar 16 00:25:09 crc kubenswrapper[4816]: I0316 00:25:09.585644 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afc880cd89cb3d0dd5c36035edaf726279cc4d27b43fc12a6df286ecc563c314" Mar 16 00:25:09 crc kubenswrapper[4816]: I0316 00:25:09.585980 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:25:10 crc kubenswrapper[4816]: I0316 00:25:10.821833 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:25:10 crc kubenswrapper[4816]: I0316 00:25:10.840239 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.935985 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:25:13 crc kubenswrapper[4816]: E0316 00:25:13.936239 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4629507-876a-405c-891c-5dcd521cf590" containerName="manage-dockerfile" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.936250 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4629507-876a-405c-891c-5dcd521cf590" containerName="manage-dockerfile" Mar 16 00:25:13 crc kubenswrapper[4816]: E0316 00:25:13.936261 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerName="extract-utilities" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.936268 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerName="extract-utilities" Mar 16 00:25:13 crc kubenswrapper[4816]: E0316 00:25:13.936277 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerName="registry-server" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.936283 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerName="registry-server" Mar 16 00:25:13 crc kubenswrapper[4816]: E0316 00:25:13.936294 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4629507-876a-405c-891c-5dcd521cf590" containerName="docker-build" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.936299 4816 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="a4629507-876a-405c-891c-5dcd521cf590" containerName="docker-build" Mar 16 00:25:13 crc kubenswrapper[4816]: E0316 00:25:13.936314 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerName="extract-content" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.936320 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerName="extract-content" Mar 16 00:25:13 crc kubenswrapper[4816]: E0316 00:25:13.936329 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4629507-876a-405c-891c-5dcd521cf590" containerName="git-clone" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.936335 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4629507-876a-405c-891c-5dcd521cf590" containerName="git-clone" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.936443 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4629507-876a-405c-891c-5dcd521cf590" containerName="docker-build" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.936459 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerName="registry-server" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.937144 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.941391 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-sys-config" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.941463 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-ca" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.941630 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-global-ca" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.941738 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.949764 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086213 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086526 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086573 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086605 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086629 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086660 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086745 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086769 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086799 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086829 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086851 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086889 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr8b4\" (UniqueName: 
\"kubernetes.io/projected/56c73079-20fb-4653-955c-7c540b94c96d-kube-api-access-xr8b4\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188272 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188367 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188456 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188518 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188603 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188662 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188686 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188725 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188774 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188870 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr8b4\" (UniqueName: \"kubernetes.io/projected/56c73079-20fb-4653-955c-7c540b94c96d-kube-api-access-xr8b4\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188940 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188987 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.189031 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.189223 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" 
Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.190700 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.190844 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.190944 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.191068 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.191065 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.191443 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.192194 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.196070 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.196717 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.231708 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr8b4\" (UniqueName: \"kubernetes.io/projected/56c73079-20fb-4653-955c-7c540b94c96d-kube-api-access-xr8b4\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.266699 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build"
Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.727145 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 16 00:25:15 crc kubenswrapper[4816]: I0316 00:25:15.638765 4816 generic.go:334] "Generic (PLEG): container finished" podID="56c73079-20fb-4653-955c-7c540b94c96d" containerID="8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695" exitCode=0
Mar 16 00:25:15 crc kubenswrapper[4816]: I0316 00:25:15.638885 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"56c73079-20fb-4653-955c-7c540b94c96d","Type":"ContainerDied","Data":"8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695"}
Mar 16 00:25:15 crc kubenswrapper[4816]: I0316 00:25:15.639747 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"56c73079-20fb-4653-955c-7c540b94c96d","Type":"ContainerStarted","Data":"a5dfa12c2bb2e35ed6fbc301ba380573c9717ad79ecb931f8c64e95920bec037"}
Mar 16 00:25:16 crc kubenswrapper[4816]: I0316 00:25:16.676928 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"56c73079-20fb-4653-955c-7c540b94c96d","Type":"ContainerStarted","Data":"246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5"}
Mar 16 00:25:16 crc kubenswrapper[4816]: I0316 00:25:16.719756 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=3.7197272679999998 podStartE2EDuration="3.719727268s" podCreationTimestamp="2026-03-16 00:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:25:16.710184236 +0000 UTC m=+1109.806484259" watchObservedRunningTime="2026-03-16 00:25:16.719727268 +0000 UTC m=+1109.816027261"
Mar 16 00:25:24 crc kubenswrapper[4816]: I0316 00:25:24.776934 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 16 00:25:24 crc kubenswrapper[4816]: I0316 00:25:24.777831 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="56c73079-20fb-4653-955c-7c540b94c96d" containerName="docker-build" containerID="cri-o://246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5" gracePeriod=30
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.154815 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_56c73079-20fb-4653-955c-7c540b94c96d/docker-build/0.log"
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.155434 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build"
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.342489 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-proxy-ca-bundles\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") "
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.342567 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-ca-bundles\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") "
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.342611 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-push\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") "
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.342662 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-pull\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") "
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.342697 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-buildworkdir\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") "
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.344033 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-run\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") "
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.344102 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-buildcachedir\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") "
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.343409 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.344132 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-node-pullsecrets\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") "
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.343718 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.343797 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.344179 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-build-blob-cache\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") "
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.344184 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.344216 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-system-configs\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") "
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.344216 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.344357 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr8b4\" (UniqueName: \"kubernetes.io/projected/56c73079-20fb-4653-955c-7c540b94c96d-kube-api-access-xr8b4\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") "
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.344509 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.345005 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-root\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") "
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.345017 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.345373 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.345434 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.345447 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.345460 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.345472 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.345483 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.345494 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.349684 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.350339 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56c73079-20fb-4653-955c-7c540b94c96d-kube-api-access-xr8b4" (OuterVolumeSpecName: "kube-api-access-xr8b4") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "kube-api-access-xr8b4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.350472 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.447400 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr8b4\" (UniqueName: \"kubernetes.io/projected/56c73079-20fb-4653-955c-7c540b94c96d-kube-api-access-xr8b4\") on node \"crc\" DevicePath \"\""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.447442 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.447454 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.519499 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.548389 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.738464 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_56c73079-20fb-4653-955c-7c540b94c96d/docker-build/0.log"
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.739326 4816 generic.go:334] "Generic (PLEG): container finished" podID="56c73079-20fb-4653-955c-7c540b94c96d" containerID="246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5" exitCode=1
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.739368 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"56c73079-20fb-4653-955c-7c540b94c96d","Type":"ContainerDied","Data":"246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5"}
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.739401 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"56c73079-20fb-4653-955c-7c540b94c96d","Type":"ContainerDied","Data":"a5dfa12c2bb2e35ed6fbc301ba380573c9717ad79ecb931f8c64e95920bec037"}
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.739423 4816 scope.go:117] "RemoveContainer" containerID="246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5"
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.739805 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build"
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.805910 4816 scope.go:117] "RemoveContainer" containerID="8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695"
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.830089 4816 scope.go:117] "RemoveContainer" containerID="246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5"
Mar 16 00:25:25 crc kubenswrapper[4816]: E0316 00:25:25.830907 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5\": container with ID starting with 246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5 not found: ID does not exist" containerID="246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5"
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.830943 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5"} err="failed to get container status \"246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5\": rpc error: code = NotFound desc = could not find container \"246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5\": container with ID starting with 246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5 not found: ID does not exist"
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.830994 4816 scope.go:117] "RemoveContainer" containerID="8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695"
Mar 16 00:25:25 crc kubenswrapper[4816]: E0316 00:25:25.832045 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695\": container with ID starting with 8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695 not found: ID does not exist" containerID="8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695"
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.832078 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695"} err="failed to get container status \"8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695\": rpc error: code = NotFound desc = could not find container \"8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695\": container with ID starting with 8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695 not found: ID does not exist"
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.886883 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.953079 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.071415 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.077783 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.441855 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"]
Mar 16 00:25:26 crc kubenswrapper[4816]: E0316 00:25:26.442165 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c73079-20fb-4653-955c-7c540b94c96d" containerName="docker-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.442177 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c73079-20fb-4653-955c-7c540b94c96d" containerName="docker-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: E0316 00:25:26.442196 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c73079-20fb-4653-955c-7c540b94c96d" containerName="manage-dockerfile"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.442203 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c73079-20fb-4653-955c-7c540b94c96d" containerName="manage-dockerfile"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.442514 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c73079-20fb-4653-955c-7c540b94c96d" containerName="docker-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.443399 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.445821 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.446081 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-sys-config"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.446781 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-global-ca"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.446913 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-ca"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.458138 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.458187 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.458258 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.458292 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.458311 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.458333 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.458355 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.458658 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.458902 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.459092 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.459180 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.459236 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xshg5\" (UniqueName: \"kubernetes.io/projected/d8a56e88-900c-411c-b75c-029cf7bee318-kube-api-access-xshg5\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.468703 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"]
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559565 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559613 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xshg5\" (UniqueName: \"kubernetes.io/projected/d8a56e88-900c-411c-b75c-029cf7bee318-kube-api-access-xshg5\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559635 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559655 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559676 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559683 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559697 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559814 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559841 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559876 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559900 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559929 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.560053 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.560170 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.560505 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.560583 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.560773 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.560840 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.560967 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " 
pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.561141 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.561233 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.564176 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.564723 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.580885 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xshg5\" (UniqueName: \"kubernetes.io/projected/d8a56e88-900c-411c-b75c-029cf7bee318-kube-api-access-xshg5\") pod \"smart-gateway-operator-2-build\" 
(UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.764380 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.947789 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 16 00:25:27 crc kubenswrapper[4816]: I0316 00:25:27.674778 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56c73079-20fb-4653-955c-7c540b94c96d" path="/var/lib/kubelet/pods/56c73079-20fb-4653-955c-7c540b94c96d/volumes" Mar 16 00:25:27 crc kubenswrapper[4816]: I0316 00:25:27.755632 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"d8a56e88-900c-411c-b75c-029cf7bee318","Type":"ContainerStarted","Data":"b5940147a61702adb2ae65b7102d9d568ce8f3720835c6af6b9846b8c1561cc9"} Mar 16 00:25:27 crc kubenswrapper[4816]: I0316 00:25:27.755694 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"d8a56e88-900c-411c-b75c-029cf7bee318","Type":"ContainerStarted","Data":"9efbca1d81be5f3f1d30dc2767d52d0c0262c6327e71c0501df25c2d4f40ebae"} Mar 16 00:25:28 crc kubenswrapper[4816]: I0316 00:25:28.763287 4816 generic.go:334] "Generic (PLEG): container finished" podID="d8a56e88-900c-411c-b75c-029cf7bee318" containerID="b5940147a61702adb2ae65b7102d9d568ce8f3720835c6af6b9846b8c1561cc9" exitCode=0 Mar 16 00:25:28 crc kubenswrapper[4816]: I0316 00:25:28.763365 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"d8a56e88-900c-411c-b75c-029cf7bee318","Type":"ContainerDied","Data":"b5940147a61702adb2ae65b7102d9d568ce8f3720835c6af6b9846b8c1561cc9"} Mar 16 00:25:29 crc kubenswrapper[4816]: I0316 
00:25:29.769169 4816 generic.go:334] "Generic (PLEG): container finished" podID="d8a56e88-900c-411c-b75c-029cf7bee318" containerID="37e4d06bccc38610b9b68c0996692bfadd6818d8d718c0c8b34e5fe1827d612c" exitCode=0 Mar 16 00:25:29 crc kubenswrapper[4816]: I0316 00:25:29.769402 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"d8a56e88-900c-411c-b75c-029cf7bee318","Type":"ContainerDied","Data":"37e4d06bccc38610b9b68c0996692bfadd6818d8d718c0c8b34e5fe1827d612c"} Mar 16 00:25:29 crc kubenswrapper[4816]: I0316 00:25:29.803756 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_d8a56e88-900c-411c-b75c-029cf7bee318/manage-dockerfile/0.log" Mar 16 00:25:30 crc kubenswrapper[4816]: I0316 00:25:30.777225 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"d8a56e88-900c-411c-b75c-029cf7bee318","Type":"ContainerStarted","Data":"520476cf1fdde0dec794a382c434e78783cc47141e612b047553998dee82f825"} Mar 16 00:25:30 crc kubenswrapper[4816]: I0316 00:25:30.804456 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=4.804436397 podStartE2EDuration="4.804436397s" podCreationTimestamp="2026-03-16 00:25:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:25:30.802970527 +0000 UTC m=+1123.899270480" watchObservedRunningTime="2026-03-16 00:25:30.804436397 +0000 UTC m=+1123.900736360" Mar 16 00:25:31 crc kubenswrapper[4816]: I0316 00:25:31.862944 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 16 00:25:31 crc kubenswrapper[4816]: I0316 00:25:31.863010 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.146121 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560346-hjpvk"] Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.147449 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560346-hjpvk" Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.150322 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.150534 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.150686 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.157630 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560346-hjpvk"] Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.306623 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9h62\" (UniqueName: \"kubernetes.io/projected/2942e78f-05b7-486f-bee0-93a942f80d8a-kube-api-access-k9h62\") pod \"auto-csr-approver-29560346-hjpvk\" (UID: \"2942e78f-05b7-486f-bee0-93a942f80d8a\") " pod="openshift-infra/auto-csr-approver-29560346-hjpvk" Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.408410 
4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9h62\" (UniqueName: \"kubernetes.io/projected/2942e78f-05b7-486f-bee0-93a942f80d8a-kube-api-access-k9h62\") pod \"auto-csr-approver-29560346-hjpvk\" (UID: \"2942e78f-05b7-486f-bee0-93a942f80d8a\") " pod="openshift-infra/auto-csr-approver-29560346-hjpvk" Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.433506 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9h62\" (UniqueName: \"kubernetes.io/projected/2942e78f-05b7-486f-bee0-93a942f80d8a-kube-api-access-k9h62\") pod \"auto-csr-approver-29560346-hjpvk\" (UID: \"2942e78f-05b7-486f-bee0-93a942f80d8a\") " pod="openshift-infra/auto-csr-approver-29560346-hjpvk" Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.466348 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560346-hjpvk" Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.701855 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560346-hjpvk"] Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.993597 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560346-hjpvk" event={"ID":"2942e78f-05b7-486f-bee0-93a942f80d8a","Type":"ContainerStarted","Data":"77b5743e696807d6ed2a02e91087afa64269c8810eed60a3b3323d5b46c5c105"} Mar 16 00:26:01 crc kubenswrapper[4816]: I0316 00:26:01.863585 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:26:01 crc kubenswrapper[4816]: I0316 00:26:01.863635 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:26:01 crc kubenswrapper[4816]: I0316 00:26:01.863673 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:26:01 crc kubenswrapper[4816]: I0316 00:26:01.864395 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d963d56deb174bcc1b2f530e646e1a1dbd328868a82631422f67c019c313cf52"} pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:26:01 crc kubenswrapper[4816]: I0316 00:26:01.864448 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" containerID="cri-o://d963d56deb174bcc1b2f530e646e1a1dbd328868a82631422f67c019c313cf52" gracePeriod=600 Mar 16 00:26:02 crc kubenswrapper[4816]: I0316 00:26:02.004496 4816 generic.go:334] "Generic (PLEG): container finished" podID="dd08ece2-7636-4966-973a-e96a34b70b53" containerID="d963d56deb174bcc1b2f530e646e1a1dbd328868a82631422f67c019c313cf52" exitCode=0 Mar 16 00:26:02 crc kubenswrapper[4816]: I0316 00:26:02.004594 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerDied","Data":"d963d56deb174bcc1b2f530e646e1a1dbd328868a82631422f67c019c313cf52"} Mar 16 00:26:02 crc kubenswrapper[4816]: I0316 00:26:02.004696 4816 scope.go:117] "RemoveContainer" 
containerID="d940a23c182654ea98c304045d406af01d62b828901045324158f53e5e4988ad" Mar 16 00:26:03 crc kubenswrapper[4816]: I0316 00:26:03.011507 4816 generic.go:334] "Generic (PLEG): container finished" podID="2942e78f-05b7-486f-bee0-93a942f80d8a" containerID="0b8b2a24c4f32aff091a974cc84de6242e724aacb4bfa1cc19578627d86a25d5" exitCode=0 Mar 16 00:26:03 crc kubenswrapper[4816]: I0316 00:26:03.011579 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560346-hjpvk" event={"ID":"2942e78f-05b7-486f-bee0-93a942f80d8a","Type":"ContainerDied","Data":"0b8b2a24c4f32aff091a974cc84de6242e724aacb4bfa1cc19578627d86a25d5"} Mar 16 00:26:03 crc kubenswrapper[4816]: I0316 00:26:03.013802 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"4dae7771bcc5c45d3db6bc1014246492c003743ca85668bad7e04528051cc6bc"} Mar 16 00:26:04 crc kubenswrapper[4816]: I0316 00:26:04.396837 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560346-hjpvk" Mar 16 00:26:04 crc kubenswrapper[4816]: I0316 00:26:04.462973 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9h62\" (UniqueName: \"kubernetes.io/projected/2942e78f-05b7-486f-bee0-93a942f80d8a-kube-api-access-k9h62\") pod \"2942e78f-05b7-486f-bee0-93a942f80d8a\" (UID: \"2942e78f-05b7-486f-bee0-93a942f80d8a\") " Mar 16 00:26:04 crc kubenswrapper[4816]: I0316 00:26:04.468718 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2942e78f-05b7-486f-bee0-93a942f80d8a-kube-api-access-k9h62" (OuterVolumeSpecName: "kube-api-access-k9h62") pod "2942e78f-05b7-486f-bee0-93a942f80d8a" (UID: "2942e78f-05b7-486f-bee0-93a942f80d8a"). InnerVolumeSpecName "kube-api-access-k9h62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:26:04 crc kubenswrapper[4816]: I0316 00:26:04.563824 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9h62\" (UniqueName: \"kubernetes.io/projected/2942e78f-05b7-486f-bee0-93a942f80d8a-kube-api-access-k9h62\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:05 crc kubenswrapper[4816]: I0316 00:26:05.027540 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560346-hjpvk" event={"ID":"2942e78f-05b7-486f-bee0-93a942f80d8a","Type":"ContainerDied","Data":"77b5743e696807d6ed2a02e91087afa64269c8810eed60a3b3323d5b46c5c105"} Mar 16 00:26:05 crc kubenswrapper[4816]: I0316 00:26:05.027604 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77b5743e696807d6ed2a02e91087afa64269c8810eed60a3b3323d5b46c5c105" Mar 16 00:26:05 crc kubenswrapper[4816]: I0316 00:26:05.027662 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560346-hjpvk" Mar 16 00:26:05 crc kubenswrapper[4816]: I0316 00:26:05.450070 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560340-pmlmw"] Mar 16 00:26:05 crc kubenswrapper[4816]: I0316 00:26:05.456875 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560340-pmlmw"] Mar 16 00:26:05 crc kubenswrapper[4816]: I0316 00:26:05.674694 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc958138-2767-4d7a-8f61-bd16b899189f" path="/var/lib/kubelet/pods/dc958138-2767-4d7a-8f61-bd16b899189f/volumes" Mar 16 00:26:33 crc kubenswrapper[4816]: I0316 00:26:33.233913 4816 generic.go:334] "Generic (PLEG): container finished" podID="d8a56e88-900c-411c-b75c-029cf7bee318" containerID="520476cf1fdde0dec794a382c434e78783cc47141e612b047553998dee82f825" exitCode=0 Mar 16 00:26:33 crc kubenswrapper[4816]: I0316 00:26:33.234000 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"d8a56e88-900c-411c-b75c-029cf7bee318","Type":"ContainerDied","Data":"520476cf1fdde0dec794a382c434e78783cc47141e612b047553998dee82f825"} Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.511383 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.515079 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-build-blob-cache\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.515226 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-buildcachedir\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.515286 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-run\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.515316 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.515323 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-pull\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.515604 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.516232 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.526640 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616220 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-buildworkdir\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616269 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xshg5\" (UniqueName: \"kubernetes.io/projected/d8a56e88-900c-411c-b75c-029cf7bee318-kube-api-access-xshg5\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616299 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-proxy-ca-bundles\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616333 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-push\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616362 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-node-pullsecrets\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616380 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-system-configs\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616421 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-root\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616440 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-ca-bundles\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616661 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616673 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.617359 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.617436 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.617854 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.618050 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.620734 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8a56e88-900c-411c-b75c-029cf7bee318-kube-api-access-xshg5" (OuterVolumeSpecName: "kube-api-access-xshg5") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "kube-api-access-xshg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.621250 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.626784 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.681789 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.718529 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.718584 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.718596 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.718605 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xshg5\" (UniqueName: \"kubernetes.io/projected/d8a56e88-900c-411c-b75c-029cf7bee318-kube-api-access-xshg5\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.718617 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.718626 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.718635 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 
16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.718644 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:35 crc kubenswrapper[4816]: I0316 00:26:35.254984 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"d8a56e88-900c-411c-b75c-029cf7bee318","Type":"ContainerDied","Data":"9efbca1d81be5f3f1d30dc2767d52d0c0262c6327e71c0501df25c2d4f40ebae"} Mar 16 00:26:35 crc kubenswrapper[4816]: I0316 00:26:35.255034 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9efbca1d81be5f3f1d30dc2767d52d0c0262c6327e71c0501df25c2d4f40ebae" Mar 16 00:26:35 crc kubenswrapper[4816]: I0316 00:26:35.255128 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:26:36 crc kubenswrapper[4816]: I0316 00:26:36.466916 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:26:36 crc kubenswrapper[4816]: I0316 00:26:36.542126 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.184736 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:26:39 crc kubenswrapper[4816]: E0316 00:26:39.185282 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2942e78f-05b7-486f-bee0-93a942f80d8a" containerName="oc" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.185298 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2942e78f-05b7-486f-bee0-93a942f80d8a" containerName="oc" Mar 16 00:26:39 crc kubenswrapper[4816]: E0316 00:26:39.185313 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a56e88-900c-411c-b75c-029cf7bee318" containerName="docker-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.185321 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a56e88-900c-411c-b75c-029cf7bee318" containerName="docker-build" Mar 16 00:26:39 crc kubenswrapper[4816]: E0316 00:26:39.185333 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a56e88-900c-411c-b75c-029cf7bee318" containerName="manage-dockerfile" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.185342 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a56e88-900c-411c-b75c-029cf7bee318" containerName="manage-dockerfile" Mar 16 00:26:39 crc kubenswrapper[4816]: E0316 00:26:39.185373 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a56e88-900c-411c-b75c-029cf7bee318" containerName="git-clone" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.185382 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d8a56e88-900c-411c-b75c-029cf7bee318" containerName="git-clone" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.185513 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2942e78f-05b7-486f-bee0-93a942f80d8a" containerName="oc" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.185529 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a56e88-900c-411c-b75c-029cf7bee318" containerName="docker-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.186408 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.188536 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.189082 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-ca" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.191027 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-sys-config" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.191077 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-global-ca" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.202256 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284271 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-pull\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284322 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-buildcachedir\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284349 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-run\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284688 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284757 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-root\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284812 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shrhw\" (UniqueName: \"kubernetes.io/projected/c622ccbe-7da3-4233-905c-bd38932a01ff-kube-api-access-shrhw\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 
00:26:39.284836 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-system-configs\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284873 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284900 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-buildworkdir\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284964 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284992 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.285035 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-push\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.386957 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-push\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387067 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-pull\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387123 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-buildcachedir\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387180 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-run\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387483 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387591 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-root\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387606 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-buildcachedir\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387660 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shrhw\" (UniqueName: \"kubernetes.io/projected/c622ccbe-7da3-4233-905c-bd38932a01ff-kube-api-access-shrhw\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387705 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-system-configs\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387752 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387795 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-buildworkdir\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387845 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387884 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.388367 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-run\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.388584 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-node-pullsecrets\") pod \"sg-core-1-build\" (UID: 
\"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.388714 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.388889 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-system-configs\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.389118 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.389851 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-root\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.390090 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-buildworkdir\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 
00:26:39.391063 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.393439 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-push\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.394842 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-pull\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.418318 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shrhw\" (UniqueName: \"kubernetes.io/projected/c622ccbe-7da3-4233-905c-bd38932a01ff-kube-api-access-shrhw\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.549315 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.773039 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:26:40 crc kubenswrapper[4816]: I0316 00:26:40.291671 4816 generic.go:334] "Generic (PLEG): container finished" podID="c622ccbe-7da3-4233-905c-bd38932a01ff" containerID="262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe" exitCode=0 Mar 16 00:26:40 crc kubenswrapper[4816]: I0316 00:26:40.291767 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"c622ccbe-7da3-4233-905c-bd38932a01ff","Type":"ContainerDied","Data":"262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe"} Mar 16 00:26:40 crc kubenswrapper[4816]: I0316 00:26:40.291952 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"c622ccbe-7da3-4233-905c-bd38932a01ff","Type":"ContainerStarted","Data":"8c4febb72e74d39d24c5326cf7e1359f6880581dd8b05174a072a1a50b9f6d8b"} Mar 16 00:26:41 crc kubenswrapper[4816]: I0316 00:26:41.300078 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"c622ccbe-7da3-4233-905c-bd38932a01ff","Type":"ContainerStarted","Data":"53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08"} Mar 16 00:26:41 crc kubenswrapper[4816]: I0316 00:26:41.323943 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=2.323920757 podStartE2EDuration="2.323920757s" podCreationTimestamp="2026-03-16 00:26:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:26:41.320154813 +0000 UTC m=+1194.416454756" watchObservedRunningTime="2026-03-16 00:26:41.323920757 +0000 UTC m=+1194.420220710" Mar 16 00:26:49 crc 
kubenswrapper[4816]: I0316 00:26:49.580526 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:26:49 crc kubenswrapper[4816]: I0316 00:26:49.581238 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="c622ccbe-7da3-4233-905c-bd38932a01ff" containerName="docker-build" containerID="cri-o://53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08" gracePeriod=30 Mar 16 00:26:49 crc kubenswrapper[4816]: I0316 00:26:49.940263 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_c622ccbe-7da3-4233-905c-bd38932a01ff/docker-build/0.log" Mar 16 00:26:49 crc kubenswrapper[4816]: I0316 00:26:49.941078 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.159872 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-pull\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.160379 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-ca-bundles\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.160614 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shrhw\" (UniqueName: \"kubernetes.io/projected/c622ccbe-7da3-4233-905c-bd38932a01ff-kube-api-access-shrhw\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") 
" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.160692 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-system-configs\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.160727 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-push\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.160916 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.161417 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.161600 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-buildcachedir\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.161683 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-proxy-ca-bundles\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.161740 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-buildworkdir\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.161903 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-build-blob-cache\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.162149 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-run\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.162274 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-root\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.162328 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-node-pullsecrets\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.162422 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.162427 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.162921 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.163170 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.163201 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.163098 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.163341 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.163347 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.163377 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.166100 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.166959 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c622ccbe-7da3-4233-905c-bd38932a01ff-kube-api-access-shrhw" (OuterVolumeSpecName: "kube-api-access-shrhw") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "kube-api-access-shrhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.169776 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.264819 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.264856 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.264871 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shrhw\" (UniqueName: \"kubernetes.io/projected/c622ccbe-7da3-4233-905c-bd38932a01ff-kube-api-access-shrhw\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.264883 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.264894 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.264904 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.271889 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-build-blob-cache" 
(OuterVolumeSpecName: "build-blob-cache") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.365510 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_c622ccbe-7da3-4233-905c-bd38932a01ff/docker-build/0.log" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.365734 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.366363 4816 generic.go:334] "Generic (PLEG): container finished" podID="c622ccbe-7da3-4233-905c-bd38932a01ff" containerID="53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08" exitCode=1 Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.366418 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"c622ccbe-7da3-4233-905c-bd38932a01ff","Type":"ContainerDied","Data":"53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08"} Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.366478 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"c622ccbe-7da3-4233-905c-bd38932a01ff","Type":"ContainerDied","Data":"8c4febb72e74d39d24c5326cf7e1359f6880581dd8b05174a072a1a50b9f6d8b"} Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.366476 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.366501 4816 scope.go:117] "RemoveContainer" containerID="53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.410628 4816 scope.go:117] "RemoveContainer" containerID="262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.433889 4816 scope.go:117] "RemoveContainer" containerID="53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08" Mar 16 00:26:50 crc kubenswrapper[4816]: E0316 00:26:50.434355 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08\": container with ID starting with 53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08 not found: ID does not exist" containerID="53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.434410 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08"} err="failed to get container status \"53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08\": rpc error: code = NotFound desc = could not find container \"53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08\": container with ID starting with 53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08 not found: ID does not exist" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.434439 4816 scope.go:117] "RemoveContainer" containerID="262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe" Mar 16 00:26:50 crc kubenswrapper[4816]: E0316 00:26:50.434947 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe\": container with ID starting with 262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe not found: ID does not exist" containerID="262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.435000 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe"} err="failed to get container status \"262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe\": rpc error: code = NotFound desc = could not find container \"262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe\": container with ID starting with 262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe not found: ID does not exist" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.608533 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.668922 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.711786 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.719785 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.330734 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 16 00:26:51 crc kubenswrapper[4816]: E0316 00:26:51.330992 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c622ccbe-7da3-4233-905c-bd38932a01ff" containerName="docker-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.331006 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c622ccbe-7da3-4233-905c-bd38932a01ff" containerName="docker-build" Mar 16 00:26:51 crc kubenswrapper[4816]: E0316 00:26:51.331022 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c622ccbe-7da3-4233-905c-bd38932a01ff" containerName="manage-dockerfile" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.331031 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c622ccbe-7da3-4233-905c-bd38932a01ff" containerName="manage-dockerfile" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.331168 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="c622ccbe-7da3-4233-905c-bd38932a01ff" containerName="docker-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.332078 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.334430 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-global-ca" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.334638 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-sys-config" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.334800 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.334996 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-ca" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.358198 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.376799 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.376866 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-system-configs\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.376995 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-root\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.377052 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.377121 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildworkdir\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.377152 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-push\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.377212 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.377258 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildcachedir\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.377277 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6hxr\" (UniqueName: \"kubernetes.io/projected/f1394889-b25e-4a90-ad3b-651e20e8ad20-kube-api-access-r6hxr\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.377317 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-pull\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.377354 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-run\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.377382 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477771 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477825 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-system-configs\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477856 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-root\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477873 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477893 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildworkdir\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477912 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-push\") pod 
\"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477933 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477954 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildcachedir\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477971 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6hxr\" (UniqueName: \"kubernetes.io/projected/f1394889-b25e-4a90-ad3b-651e20e8ad20-kube-api-access-r6hxr\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477989 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-pull\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.478008 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-run\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " 
pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.478027 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.478100 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.478324 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-root\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.478353 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildworkdir\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.478716 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.478827 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildcachedir\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.479166 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.479173 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-run\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.479642 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.479897 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-system-configs\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.482222 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: 
\"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-pull\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.482236 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-push\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.494924 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6hxr\" (UniqueName: \"kubernetes.io/projected/f1394889-b25e-4a90-ad3b-651e20e8ad20-kube-api-access-r6hxr\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.665611 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.675012 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c622ccbe-7da3-4233-905c-bd38932a01ff" path="/var/lib/kubelet/pods/c622ccbe-7da3-4233-905c-bd38932a01ff/volumes" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.851154 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 16 00:26:52 crc kubenswrapper[4816]: I0316 00:26:52.387971 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"f1394889-b25e-4a90-ad3b-651e20e8ad20","Type":"ContainerStarted","Data":"3cc439cbecc1d9ee2a5bd6390fe4f76ea33e05cf5c2ab078d9693997d08ecf9a"} Mar 16 00:26:52 crc kubenswrapper[4816]: I0316 00:26:52.388024 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"f1394889-b25e-4a90-ad3b-651e20e8ad20","Type":"ContainerStarted","Data":"ae3c6d8445adc1875e9fd69aeb8761204c220a359e0c05ee64563ec952a146ae"} Mar 16 00:26:53 crc kubenswrapper[4816]: I0316 00:26:53.396096 4816 generic.go:334] "Generic (PLEG): container finished" podID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerID="3cc439cbecc1d9ee2a5bd6390fe4f76ea33e05cf5c2ab078d9693997d08ecf9a" exitCode=0 Mar 16 00:26:53 crc kubenswrapper[4816]: I0316 00:26:53.396144 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"f1394889-b25e-4a90-ad3b-651e20e8ad20","Type":"ContainerDied","Data":"3cc439cbecc1d9ee2a5bd6390fe4f76ea33e05cf5c2ab078d9693997d08ecf9a"} Mar 16 00:26:54 crc kubenswrapper[4816]: I0316 00:26:54.405873 4816 generic.go:334] "Generic (PLEG): container finished" podID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerID="55a81fa5148d34c2655ea1ac667509031445a92bd58b8398676af8c464844e7c" exitCode=0 Mar 16 00:26:54 crc kubenswrapper[4816]: I0316 00:26:54.405921 4816 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"f1394889-b25e-4a90-ad3b-651e20e8ad20","Type":"ContainerDied","Data":"55a81fa5148d34c2655ea1ac667509031445a92bd58b8398676af8c464844e7c"} Mar 16 00:26:54 crc kubenswrapper[4816]: I0316 00:26:54.449245 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-2-build_f1394889-b25e-4a90-ad3b-651e20e8ad20/manage-dockerfile/0.log" Mar 16 00:26:55 crc kubenswrapper[4816]: I0316 00:26:55.418829 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"f1394889-b25e-4a90-ad3b-651e20e8ad20","Type":"ContainerStarted","Data":"9f92516e42ba33b1d2e8579fa9c2dd369bfb873ad948e6fca41db7e816622c1e"} Mar 16 00:26:55 crc kubenswrapper[4816]: I0316 00:26:55.464216 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=4.464198626 podStartE2EDuration="4.464198626s" podCreationTimestamp="2026-03-16 00:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:26:55.459140565 +0000 UTC m=+1208.555440538" watchObservedRunningTime="2026-03-16 00:26:55.464198626 +0000 UTC m=+1208.560498589" Mar 16 00:27:04 crc kubenswrapper[4816]: I0316 00:27:04.144316 4816 scope.go:117] "RemoveContainer" containerID="4565949d11f1fa384d67b3420395f0c07c9d2ee22190f1a94b2e1bc9e4c10a96" Mar 16 00:27:21 crc kubenswrapper[4816]: E0316 00:27:21.987691 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1394889_b25e_4a90_ad3b_651e20e8ad20.slice/buildah-buildah3198239443\": RecentStats: unable to find data in memory cache]" Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.133776 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29560348-xvv6w"] Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.135292 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560348-xvv6w" Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.137457 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.138078 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.140763 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.141582 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560348-xvv6w"] Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.190564 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5fg2\" (UniqueName: \"kubernetes.io/projected/a529fd1f-66e5-4e49-b95a-18c6a8aade4b-kube-api-access-s5fg2\") pod \"auto-csr-approver-29560348-xvv6w\" (UID: \"a529fd1f-66e5-4e49-b95a-18c6a8aade4b\") " pod="openshift-infra/auto-csr-approver-29560348-xvv6w" Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.291214 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5fg2\" (UniqueName: \"kubernetes.io/projected/a529fd1f-66e5-4e49-b95a-18c6a8aade4b-kube-api-access-s5fg2\") pod \"auto-csr-approver-29560348-xvv6w\" (UID: \"a529fd1f-66e5-4e49-b95a-18c6a8aade4b\") " pod="openshift-infra/auto-csr-approver-29560348-xvv6w" Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.314098 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5fg2\" (UniqueName: 
\"kubernetes.io/projected/a529fd1f-66e5-4e49-b95a-18c6a8aade4b-kube-api-access-s5fg2\") pod \"auto-csr-approver-29560348-xvv6w\" (UID: \"a529fd1f-66e5-4e49-b95a-18c6a8aade4b\") " pod="openshift-infra/auto-csr-approver-29560348-xvv6w" Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.450937 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560348-xvv6w" Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.681310 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560348-xvv6w"] Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.687805 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.853811 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560348-xvv6w" event={"ID":"a529fd1f-66e5-4e49-b95a-18c6a8aade4b","Type":"ContainerStarted","Data":"c0597947f63d22a5a4d56c7617834f9f25d8bc5ca2d1f02506b343df4dd98c86"} Mar 16 00:28:02 crc kubenswrapper[4816]: I0316 00:28:02.866999 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560348-xvv6w" event={"ID":"a529fd1f-66e5-4e49-b95a-18c6a8aade4b","Type":"ContainerStarted","Data":"2184a6c7d5ea889f0c49670caabfc30e2cdb52bf2b9beb7864557d83b84bbb54"} Mar 16 00:28:02 crc kubenswrapper[4816]: I0316 00:28:02.881252 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29560348-xvv6w" podStartSLOduration=1.099234435 podStartE2EDuration="2.881237333s" podCreationTimestamp="2026-03-16 00:28:00 +0000 UTC" firstStartedPulling="2026-03-16 00:28:00.687616033 +0000 UTC m=+1273.783915986" lastFinishedPulling="2026-03-16 00:28:02.469618931 +0000 UTC m=+1275.565918884" observedRunningTime="2026-03-16 00:28:02.877239071 +0000 UTC m=+1275.973539024" 
watchObservedRunningTime="2026-03-16 00:28:02.881237333 +0000 UTC m=+1275.977537286" Mar 16 00:28:03 crc kubenswrapper[4816]: I0316 00:28:03.873303 4816 generic.go:334] "Generic (PLEG): container finished" podID="a529fd1f-66e5-4e49-b95a-18c6a8aade4b" containerID="2184a6c7d5ea889f0c49670caabfc30e2cdb52bf2b9beb7864557d83b84bbb54" exitCode=0 Mar 16 00:28:03 crc kubenswrapper[4816]: I0316 00:28:03.873352 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560348-xvv6w" event={"ID":"a529fd1f-66e5-4e49-b95a-18c6a8aade4b","Type":"ContainerDied","Data":"2184a6c7d5ea889f0c49670caabfc30e2cdb52bf2b9beb7864557d83b84bbb54"} Mar 16 00:28:05 crc kubenswrapper[4816]: I0316 00:28:05.087824 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560348-xvv6w" Mar 16 00:28:05 crc kubenswrapper[4816]: I0316 00:28:05.163832 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5fg2\" (UniqueName: \"kubernetes.io/projected/a529fd1f-66e5-4e49-b95a-18c6a8aade4b-kube-api-access-s5fg2\") pod \"a529fd1f-66e5-4e49-b95a-18c6a8aade4b\" (UID: \"a529fd1f-66e5-4e49-b95a-18c6a8aade4b\") " Mar 16 00:28:05 crc kubenswrapper[4816]: I0316 00:28:05.169300 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a529fd1f-66e5-4e49-b95a-18c6a8aade4b-kube-api-access-s5fg2" (OuterVolumeSpecName: "kube-api-access-s5fg2") pod "a529fd1f-66e5-4e49-b95a-18c6a8aade4b" (UID: "a529fd1f-66e5-4e49-b95a-18c6a8aade4b"). InnerVolumeSpecName "kube-api-access-s5fg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:28:05 crc kubenswrapper[4816]: I0316 00:28:05.265677 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5fg2\" (UniqueName: \"kubernetes.io/projected/a529fd1f-66e5-4e49-b95a-18c6a8aade4b-kube-api-access-s5fg2\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:05 crc kubenswrapper[4816]: I0316 00:28:05.887171 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560348-xvv6w" event={"ID":"a529fd1f-66e5-4e49-b95a-18c6a8aade4b","Type":"ContainerDied","Data":"c0597947f63d22a5a4d56c7617834f9f25d8bc5ca2d1f02506b343df4dd98c86"} Mar 16 00:28:05 crc kubenswrapper[4816]: I0316 00:28:05.887203 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0597947f63d22a5a4d56c7617834f9f25d8bc5ca2d1f02506b343df4dd98c86" Mar 16 00:28:05 crc kubenswrapper[4816]: I0316 00:28:05.887208 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560348-xvv6w" Mar 16 00:28:05 crc kubenswrapper[4816]: I0316 00:28:05.937838 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560342-qq7qg"] Mar 16 00:28:05 crc kubenswrapper[4816]: I0316 00:28:05.943741 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560342-qq7qg"] Mar 16 00:28:07 crc kubenswrapper[4816]: I0316 00:28:07.680910 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d60f1a00-e9c6-46ff-b5eb-f3c680f04736" path="/var/lib/kubelet/pods/d60f1a00-e9c6-46ff-b5eb-f3c680f04736/volumes" Mar 16 00:28:31 crc kubenswrapper[4816]: I0316 00:28:31.863776 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 16 00:28:31 crc kubenswrapper[4816]: I0316 00:28:31.864256 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:29:01 crc kubenswrapper[4816]: I0316 00:29:01.863479 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:29:01 crc kubenswrapper[4816]: I0316 00:29:01.864165 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:29:04 crc kubenswrapper[4816]: I0316 00:29:04.230406 4816 scope.go:117] "RemoveContainer" containerID="dbd7c0bfa602e132787d7d6d843e255ebdb6acf34354466437ff4e5db80a17a7" Mar 16 00:29:31 crc kubenswrapper[4816]: I0316 00:29:31.863574 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:29:31 crc kubenswrapper[4816]: I0316 00:29:31.865010 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:29:31 crc kubenswrapper[4816]: I0316 00:29:31.865117 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:29:31 crc kubenswrapper[4816]: I0316 00:29:31.865778 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4dae7771bcc5c45d3db6bc1014246492c003743ca85668bad7e04528051cc6bc"} pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:29:31 crc kubenswrapper[4816]: I0316 00:29:31.865924 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" containerID="cri-o://4dae7771bcc5c45d3db6bc1014246492c003743ca85668bad7e04528051cc6bc" gracePeriod=600 Mar 16 00:29:32 crc kubenswrapper[4816]: I0316 00:29:32.499073 4816 generic.go:334] "Generic (PLEG): container finished" podID="dd08ece2-7636-4966-973a-e96a34b70b53" containerID="4dae7771bcc5c45d3db6bc1014246492c003743ca85668bad7e04528051cc6bc" exitCode=0 Mar 16 00:29:32 crc kubenswrapper[4816]: I0316 00:29:32.499114 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerDied","Data":"4dae7771bcc5c45d3db6bc1014246492c003743ca85668bad7e04528051cc6bc"} Mar 16 00:29:32 crc kubenswrapper[4816]: I0316 00:29:32.499147 4816 scope.go:117] "RemoveContainer" containerID="d963d56deb174bcc1b2f530e646e1a1dbd328868a82631422f67c019c313cf52" Mar 16 00:29:33 crc kubenswrapper[4816]: I0316 00:29:33.508180 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"92fd160da980a35a692640a98195800839c4f80b2447586e89c2230217ad0071"} Mar 16 00:29:59 crc kubenswrapper[4816]: I0316 00:29:59.712814 4816 generic.go:334] "Generic (PLEG): container finished" podID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerID="9f92516e42ba33b1d2e8579fa9c2dd369bfb873ad948e6fca41db7e816622c1e" exitCode=0 Mar 16 00:29:59 crc kubenswrapper[4816]: I0316 00:29:59.712902 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"f1394889-b25e-4a90-ad3b-651e20e8ad20","Type":"ContainerDied","Data":"9f92516e42ba33b1d2e8579fa9c2dd369bfb873ad948e6fca41db7e816622c1e"} Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.151345 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560350-6dpp5"] Mar 16 00:30:00 crc kubenswrapper[4816]: E0316 00:30:00.151924 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a529fd1f-66e5-4e49-b95a-18c6a8aade4b" containerName="oc" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.151940 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a529fd1f-66e5-4e49-b95a-18c6a8aade4b" containerName="oc" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.157088 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a529fd1f-66e5-4e49-b95a-18c6a8aade4b" containerName="oc" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.158351 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560350-6dpp5" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.165925 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.166251 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.166585 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.169232 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp"] Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.170475 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.173213 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.173343 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.177732 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560350-6dpp5"] Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.183662 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp"] Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.247290 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6zlg\" (UniqueName: 
\"kubernetes.io/projected/12bfc435-89c2-4917-9bb6-cc2e9eca440c-kube-api-access-c6zlg\") pod \"auto-csr-approver-29560350-6dpp5\" (UID: \"12bfc435-89c2-4917-9bb6-cc2e9eca440c\") " pod="openshift-infra/auto-csr-approver-29560350-6dpp5" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.247379 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64s4v\" (UniqueName: \"kubernetes.io/projected/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-kube-api-access-64s4v\") pod \"collect-profiles-29560350-pcslp\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.247442 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-secret-volume\") pod \"collect-profiles-29560350-pcslp\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.247462 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-config-volume\") pod \"collect-profiles-29560350-pcslp\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.349392 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6zlg\" (UniqueName: \"kubernetes.io/projected/12bfc435-89c2-4917-9bb6-cc2e9eca440c-kube-api-access-c6zlg\") pod \"auto-csr-approver-29560350-6dpp5\" (UID: \"12bfc435-89c2-4917-9bb6-cc2e9eca440c\") " pod="openshift-infra/auto-csr-approver-29560350-6dpp5" Mar 16 00:30:00 
crc kubenswrapper[4816]: I0316 00:30:00.349488 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64s4v\" (UniqueName: \"kubernetes.io/projected/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-kube-api-access-64s4v\") pod \"collect-profiles-29560350-pcslp\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp"
Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.349653 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-secret-volume\") pod \"collect-profiles-29560350-pcslp\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp"
Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.349700 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-config-volume\") pod \"collect-profiles-29560350-pcslp\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp"
Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.351855 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-config-volume\") pod \"collect-profiles-29560350-pcslp\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp"
Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.363542 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-secret-volume\") pod \"collect-profiles-29560350-pcslp\" (UID:
\"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.379706 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64s4v\" (UniqueName: \"kubernetes.io/projected/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-kube-api-access-64s4v\") pod \"collect-profiles-29560350-pcslp\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.380598 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6zlg\" (UniqueName: \"kubernetes.io/projected/12bfc435-89c2-4917-9bb6-cc2e9eca440c-kube-api-access-c6zlg\") pod \"auto-csr-approver-29560350-6dpp5\" (UID: \"12bfc435-89c2-4917-9bb6-cc2e9eca440c\") " pod="openshift-infra/auto-csr-approver-29560350-6dpp5" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.493745 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560350-6dpp5" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.507236 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.693812 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560350-6dpp5"] Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.721086 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560350-6dpp5" event={"ID":"12bfc435-89c2-4917-9bb6-cc2e9eca440c","Type":"ContainerStarted","Data":"49cea63e1c43a10078ab745d499a1cd66311bccc4b9367191a210448cc27ed33"} Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.937680 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.945982 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp"] Mar 16 00:30:00 crc kubenswrapper[4816]: W0316 00:30:00.950419 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc27926cb_7a0c_4dff_a823_0c9cfdb9977c.slice/crio-c8710f60fe7c9f4ba20c58866354cddd542d4c1c8693fb3b7c4d88443de40c27 WatchSource:0}: Error finding container c8710f60fe7c9f4ba20c58866354cddd542d4c1c8693fb3b7c4d88443de40c27: Status 404 returned error can't find the container with id c8710f60fe7c9f4ba20c58866354cddd542d4c1c8693fb3b7c4d88443de40c27 Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.959714 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6hxr\" (UniqueName: \"kubernetes.io/projected/f1394889-b25e-4a90-ad3b-651e20e8ad20-kube-api-access-r6hxr\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.959930 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-system-configs\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960071 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-push\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960134 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-proxy-ca-bundles\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960191 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildcachedir\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960304 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildworkdir\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960385 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-ca-bundles\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960456 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-pull\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960529 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-blob-cache\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960612 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960624 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-root\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960721 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-run\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960758 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-node-pullsecrets\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.961180 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-system-configs\") on node \"crc\" 
DevicePath \"\"" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.961215 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.962835 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.963707 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.966893 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.967757 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.968185 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.969685 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.972445 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1394889-b25e-4a90-ad3b-651e20e8ad20-kube-api-access-r6hxr" (OuterVolumeSpecName: "kube-api-access-r6hxr") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "kube-api-access-r6hxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.986880 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.063352 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.063393 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.063405 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.063425 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.063434 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.063442 4816 reconciler_common.go:293] "Volume detached for volume 
\"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.063504 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.063515 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.063584 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6hxr\" (UniqueName: \"kubernetes.io/projected/f1394889-b25e-4a90-ad3b-651e20e8ad20-kube-api-access-r6hxr\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.276783 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.367440 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.730661 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"f1394889-b25e-4a90-ad3b-651e20e8ad20","Type":"ContainerDied","Data":"ae3c6d8445adc1875e9fd69aeb8761204c220a359e0c05ee64563ec952a146ae"} Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.730716 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae3c6d8445adc1875e9fd69aeb8761204c220a359e0c05ee64563ec952a146ae" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.730864 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.732586 4816 generic.go:334] "Generic (PLEG): container finished" podID="c27926cb-7a0c-4dff-a823-0c9cfdb9977c" containerID="d0c63cafc91b5e5581126cae772dc10861d08f21c797798bd26dc16d3fd85d6a" exitCode=0 Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.732633 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" event={"ID":"c27926cb-7a0c-4dff-a823-0c9cfdb9977c","Type":"ContainerDied","Data":"d0c63cafc91b5e5581126cae772dc10861d08f21c797798bd26dc16d3fd85d6a"} Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.732660 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" event={"ID":"c27926cb-7a0c-4dff-a823-0c9cfdb9977c","Type":"ContainerStarted","Data":"c8710f60fe7c9f4ba20c58866354cddd542d4c1c8693fb3b7c4d88443de40c27"} Mar 16 00:30:02 
crc kubenswrapper[4816]: I0316 00:30:02.740276 4816 generic.go:334] "Generic (PLEG): container finished" podID="12bfc435-89c2-4917-9bb6-cc2e9eca440c" containerID="a1355d11ec449f6a9fd6597a935b6361539d556da9968192441a1a7760e23960" exitCode=0
Mar 16 00:30:02 crc kubenswrapper[4816]: I0316 00:30:02.740487 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560350-6dpp5" event={"ID":"12bfc435-89c2-4917-9bb6-cc2e9eca440c","Type":"ContainerDied","Data":"a1355d11ec449f6a9fd6597a935b6361539d556da9968192441a1a7760e23960"}
Mar 16 00:30:02 crc kubenswrapper[4816]: I0316 00:30:02.953696 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp"
Mar 16 00:30:02 crc kubenswrapper[4816]: I0316 00:30:02.988287 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64s4v\" (UniqueName: \"kubernetes.io/projected/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-kube-api-access-64s4v\") pod \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") "
Mar 16 00:30:02 crc kubenswrapper[4816]: I0316 00:30:02.988349 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-config-volume\") pod \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") "
Mar 16 00:30:02 crc kubenswrapper[4816]: I0316 00:30:02.988405 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-secret-volume\") pod \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") "
Mar 16 00:30:02 crc kubenswrapper[4816]: I0316 00:30:02.989127 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/configmap/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-config-volume" (OuterVolumeSpecName: "config-volume") pod "c27926cb-7a0c-4dff-a823-0c9cfdb9977c" (UID: "c27926cb-7a0c-4dff-a823-0c9cfdb9977c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:30:02 crc kubenswrapper[4816]: I0316 00:30:02.993780 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c27926cb-7a0c-4dff-a823-0c9cfdb9977c" (UID: "c27926cb-7a0c-4dff-a823-0c9cfdb9977c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:30:02 crc kubenswrapper[4816]: I0316 00:30:02.995446 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-kube-api-access-64s4v" (OuterVolumeSpecName: "kube-api-access-64s4v") pod "c27926cb-7a0c-4dff-a823-0c9cfdb9977c" (UID: "c27926cb-7a0c-4dff-a823-0c9cfdb9977c"). InnerVolumeSpecName "kube-api-access-64s4v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:30:03 crc kubenswrapper[4816]: I0316 00:30:03.089240 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64s4v\" (UniqueName: \"kubernetes.io/projected/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-kube-api-access-64s4v\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:03 crc kubenswrapper[4816]: I0316 00:30:03.089269 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-config-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:03 crc kubenswrapper[4816]: I0316 00:30:03.089279 4816 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:03 crc kubenswrapper[4816]: I0316 00:30:03.464632 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:30:03 crc kubenswrapper[4816]: I0316 00:30:03.493842 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:03 crc kubenswrapper[4816]: I0316 00:30:03.750127 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:03 crc kubenswrapper[4816]: I0316 00:30:03.750116 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" event={"ID":"c27926cb-7a0c-4dff-a823-0c9cfdb9977c","Type":"ContainerDied","Data":"c8710f60fe7c9f4ba20c58866354cddd542d4c1c8693fb3b7c4d88443de40c27"} Mar 16 00:30:03 crc kubenswrapper[4816]: I0316 00:30:03.750327 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8710f60fe7c9f4ba20c58866354cddd542d4c1c8693fb3b7c4d88443de40c27" Mar 16 00:30:03 crc kubenswrapper[4816]: I0316 00:30:03.956678 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560350-6dpp5" Mar 16 00:30:04 crc kubenswrapper[4816]: I0316 00:30:03.998722 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6zlg\" (UniqueName: \"kubernetes.io/projected/12bfc435-89c2-4917-9bb6-cc2e9eca440c-kube-api-access-c6zlg\") pod \"12bfc435-89c2-4917-9bb6-cc2e9eca440c\" (UID: \"12bfc435-89c2-4917-9bb6-cc2e9eca440c\") " Mar 16 00:30:04 crc kubenswrapper[4816]: I0316 00:30:04.019836 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12bfc435-89c2-4917-9bb6-cc2e9eca440c-kube-api-access-c6zlg" (OuterVolumeSpecName: "kube-api-access-c6zlg") pod "12bfc435-89c2-4917-9bb6-cc2e9eca440c" (UID: "12bfc435-89c2-4917-9bb6-cc2e9eca440c"). InnerVolumeSpecName "kube-api-access-c6zlg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:30:04 crc kubenswrapper[4816]: I0316 00:30:04.103344 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6zlg\" (UniqueName: \"kubernetes.io/projected/12bfc435-89c2-4917-9bb6-cc2e9eca440c-kube-api-access-c6zlg\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:04 crc kubenswrapper[4816]: I0316 00:30:04.759587 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560350-6dpp5" event={"ID":"12bfc435-89c2-4917-9bb6-cc2e9eca440c","Type":"ContainerDied","Data":"49cea63e1c43a10078ab745d499a1cd66311bccc4b9367191a210448cc27ed33"} Mar 16 00:30:04 crc kubenswrapper[4816]: I0316 00:30:04.759634 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560350-6dpp5" Mar 16 00:30:04 crc kubenswrapper[4816]: I0316 00:30:04.759638 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49cea63e1c43a10078ab745d499a1cd66311bccc4b9367191a210448cc27ed33" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.013659 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560344-qmt9b"] Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.022672 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560344-qmt9b"] Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.676534 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d" path="/var/lib/kubelet/pods/add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d/volumes" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.807348 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:30:05 crc kubenswrapper[4816]: E0316 00:30:05.807611 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerName="git-clone" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.807622 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerName="git-clone" Mar 16 00:30:05 crc kubenswrapper[4816]: E0316 00:30:05.807636 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerName="docker-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.807642 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerName="docker-build" Mar 16 00:30:05 crc kubenswrapper[4816]: E0316 00:30:05.807652 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerName="manage-dockerfile" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.807658 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerName="manage-dockerfile" Mar 16 00:30:05 crc kubenswrapper[4816]: E0316 00:30:05.807670 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c27926cb-7a0c-4dff-a823-0c9cfdb9977c" containerName="collect-profiles" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.807675 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c27926cb-7a0c-4dff-a823-0c9cfdb9977c" containerName="collect-profiles" Mar 16 00:30:05 crc kubenswrapper[4816]: E0316 00:30:05.807683 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12bfc435-89c2-4917-9bb6-cc2e9eca440c" containerName="oc" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.807688 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="12bfc435-89c2-4917-9bb6-cc2e9eca440c" containerName="oc" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.807784 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="12bfc435-89c2-4917-9bb6-cc2e9eca440c" 
containerName="oc" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.807792 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="c27926cb-7a0c-4dff-a823-0c9cfdb9977c" containerName="collect-profiles" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.807805 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerName="docker-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.808357 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.810905 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-ca" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.812827 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-global-ca" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.813065 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.813305 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-sys-config" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.827858 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.827923 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-ca-bundles\") pod 
\"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.827957 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.827984 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.828067 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.828213 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.828247 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-push\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.828275 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.828302 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-pull\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.828389 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2jdp\" (UniqueName: \"kubernetes.io/projected/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-kube-api-access-f2jdp\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.828531 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.828649 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc 
kubenswrapper[4816]: I0316 00:30:05.828691 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930138 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930251 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930289 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930325 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930355 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-push\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930384 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930419 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-pull\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930479 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2jdp\" (UniqueName: \"kubernetes.io/projected/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-kube-api-access-f2jdp\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930525 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930583 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930624 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930663 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930763 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930933 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930965 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-blob-cache\") 
pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.931035 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.931045 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.931295 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.931508 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.931644 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " 
pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.931881 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.934183 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-push\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.937162 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-pull\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.960774 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2jdp\" (UniqueName: \"kubernetes.io/projected/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-kube-api-access-f2jdp\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:06 crc kubenswrapper[4816]: I0316 00:30:06.129038 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:06 crc kubenswrapper[4816]: I0316 00:30:06.381115 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:30:06 crc kubenswrapper[4816]: W0316 00:30:06.387346 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a12a3fd_22a7_4cc3_ac53_7463cafa502b.slice/crio-8a107ef141b4ece40f15dcdf9c4c87527a33a8b3c6765c1fce2f47803bc47549 WatchSource:0}: Error finding container 8a107ef141b4ece40f15dcdf9c4c87527a33a8b3c6765c1fce2f47803bc47549: Status 404 returned error can't find the container with id 8a107ef141b4ece40f15dcdf9c4c87527a33a8b3c6765c1fce2f47803bc47549 Mar 16 00:30:06 crc kubenswrapper[4816]: I0316 00:30:06.771975 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"8a12a3fd-22a7-4cc3-ac53-7463cafa502b","Type":"ContainerStarted","Data":"8a107ef141b4ece40f15dcdf9c4c87527a33a8b3c6765c1fce2f47803bc47549"} Mar 16 00:30:07 crc kubenswrapper[4816]: I0316 00:30:07.783716 4816 generic.go:334] "Generic (PLEG): container finished" podID="8a12a3fd-22a7-4cc3-ac53-7463cafa502b" containerID="dbea9dcde116242b7d23230041fd331ed37ce34d53b83f1909398b83e6d4d7ee" exitCode=0 Mar 16 00:30:07 crc kubenswrapper[4816]: I0316 00:30:07.783869 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"8a12a3fd-22a7-4cc3-ac53-7463cafa502b","Type":"ContainerDied","Data":"dbea9dcde116242b7d23230041fd331ed37ce34d53b83f1909398b83e6d4d7ee"} Mar 16 00:30:08 crc kubenswrapper[4816]: I0316 00:30:08.794256 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"8a12a3fd-22a7-4cc3-ac53-7463cafa502b","Type":"ContainerStarted","Data":"4b795c4bb6bb53330ab38747d1f1ac7c12b7bd727854ad62563f6bcf284e3e15"} Mar 16 00:30:08 crc kubenswrapper[4816]: 
I0316 00:30:08.822142 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=3.822117102 podStartE2EDuration="3.822117102s" podCreationTimestamp="2026-03-16 00:30:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:30:08.81894139 +0000 UTC m=+1401.915241363" watchObservedRunningTime="2026-03-16 00:30:08.822117102 +0000 UTC m=+1401.918417075" Mar 16 00:30:15 crc kubenswrapper[4816]: I0316 00:30:15.622452 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:30:15 crc kubenswrapper[4816]: I0316 00:30:15.623629 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-bridge-1-build" podUID="8a12a3fd-22a7-4cc3-ac53-7463cafa502b" containerName="docker-build" containerID="cri-o://4b795c4bb6bb53330ab38747d1f1ac7c12b7bd727854ad62563f6bcf284e3e15" gracePeriod=30 Mar 16 00:30:15 crc kubenswrapper[4816]: I0316 00:30:15.863279 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_8a12a3fd-22a7-4cc3-ac53-7463cafa502b/docker-build/0.log" Mar 16 00:30:15 crc kubenswrapper[4816]: I0316 00:30:15.867123 4816 generic.go:334] "Generic (PLEG): container finished" podID="8a12a3fd-22a7-4cc3-ac53-7463cafa502b" containerID="4b795c4bb6bb53330ab38747d1f1ac7c12b7bd727854ad62563f6bcf284e3e15" exitCode=1 Mar 16 00:30:15 crc kubenswrapper[4816]: I0316 00:30:15.867175 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"8a12a3fd-22a7-4cc3-ac53-7463cafa502b","Type":"ContainerDied","Data":"4b795c4bb6bb53330ab38747d1f1ac7c12b7bd727854ad62563f6bcf284e3e15"} Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.061077 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_sg-bridge-1-build_8a12a3fd-22a7-4cc3-ac53-7463cafa502b/docker-build/0.log" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.061395 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.096168 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-blob-cache\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.096242 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-root\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.096321 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2jdp\" (UniqueName: \"kubernetes.io/projected/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-kube-api-access-f2jdp\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.096475 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-pull\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.098659 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-system-configs\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.098721 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-ca-bundles\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.098745 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-node-pullsecrets\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.098768 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildworkdir\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.098802 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-push\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.098823 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-run\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 
16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.098839 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-proxy-ca-bundles\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") "
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.098855 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildcachedir\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") "
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.099244 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.100140 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.100245 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.100416 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.100913 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.101174 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.101237 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.102360 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.102721 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-kube-api-access-f2jdp" (OuterVolumeSpecName: "kube-api-access-f2jdp") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "kube-api-access-f2jdp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.103099 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.173219 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200411 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200458 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200473 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200486 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200501 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200541 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200572 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200585 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200598 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200610 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200622 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2jdp\" (UniqueName: \"kubernetes.io/projected/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-kube-api-access-f2jdp\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.511236 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.512197 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.877319 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_8a12a3fd-22a7-4cc3-ac53-7463cafa502b/docker-build/0.log"
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.877991 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"8a12a3fd-22a7-4cc3-ac53-7463cafa502b","Type":"ContainerDied","Data":"8a107ef141b4ece40f15dcdf9c4c87527a33a8b3c6765c1fce2f47803bc47549"}
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.878045 4816 scope.go:117] "RemoveContainer" containerID="4b795c4bb6bb53330ab38747d1f1ac7c12b7bd727854ad62563f6bcf284e3e15"
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.878164 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build"
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.908644 4816 scope.go:117] "RemoveContainer" containerID="dbea9dcde116242b7d23230041fd331ed37ce34d53b83f1909398b83e6d4d7ee"
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.928806 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"]
Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.952089 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"]
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.281317 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"]
Mar 16 00:30:17 crc kubenswrapper[4816]: E0316 00:30:17.298499 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a12a3fd-22a7-4cc3-ac53-7463cafa502b" containerName="docker-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.298788 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a12a3fd-22a7-4cc3-ac53-7463cafa502b" containerName="docker-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: E0316 00:30:17.298908 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a12a3fd-22a7-4cc3-ac53-7463cafa502b" containerName="manage-dockerfile"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.299009 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a12a3fd-22a7-4cc3-ac53-7463cafa502b" containerName="manage-dockerfile"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.299920 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a12a3fd-22a7-4cc3-ac53-7463cafa502b" containerName="docker-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.301948 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.312808 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-global-ca"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.314178 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-sys-config"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.314592 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.314975 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-ca"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.315492 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"]
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325256 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-pull\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325320 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325359 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-push\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325390 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325425 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325447 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325468 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvt9g\" (UniqueName: \"kubernetes.io/projected/e28d6969-ebed-4cf6-bb79-47e69bd952b9-kube-api-access-xvt9g\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325496 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325526 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325582 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325605 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325643 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.427400 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-pull\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.427901 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.428143 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-push\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.428355 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.428513 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.428826 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.429059 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.429279 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.429472 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.429728 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvt9g\" (UniqueName: \"kubernetes.io/projected/e28d6969-ebed-4cf6-bb79-47e69bd952b9-kube-api-access-xvt9g\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.429995 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.430283 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.430518 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.430767 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.430071 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.429656 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.431947 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-pull\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.432292 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.437233 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.437306 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.437535 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.437948 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.437977 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-push\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.444771 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvt9g\" (UniqueName: \"kubernetes.io/projected/e28d6969-ebed-4cf6-bb79-47e69bd952b9-kube-api-access-xvt9g\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.624316 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.675565 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a12a3fd-22a7-4cc3-ac53-7463cafa502b" path="/var/lib/kubelet/pods/8a12a3fd-22a7-4cc3-ac53-7463cafa502b/volumes"
Mar 16 00:30:18 crc kubenswrapper[4816]: I0316 00:30:18.044404 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"]
Mar 16 00:30:18 crc kubenswrapper[4816]: I0316 00:30:18.897144 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e28d6969-ebed-4cf6-bb79-47e69bd952b9","Type":"ContainerStarted","Data":"e670917e4f348cf3256ada42b19e331e3cfaa4ac463d6f46f290fec2ade196ca"}
Mar 16 00:30:18 crc kubenswrapper[4816]: I0316 00:30:18.897377 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e28d6969-ebed-4cf6-bb79-47e69bd952b9","Type":"ContainerStarted","Data":"b69b4af40f3f142efec97d6238cfcbf6adef1518048da6359bdb13cfce32e6b9"}
Mar 16 00:30:19 crc kubenswrapper[4816]: E0316 00:30:19.010032 4816 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.158:52852->38.102.83.158:35591: write tcp 38.102.83.158:52852->38.102.83.158:35591: write: connection reset by peer
Mar 16 00:30:19 crc kubenswrapper[4816]: I0316 00:30:19.906844 4816 generic.go:334] "Generic (PLEG): container finished" podID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" containerID="e670917e4f348cf3256ada42b19e331e3cfaa4ac463d6f46f290fec2ade196ca" exitCode=0
Mar 16 00:30:19 crc kubenswrapper[4816]: I0316 00:30:19.906910 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e28d6969-ebed-4cf6-bb79-47e69bd952b9","Type":"ContainerDied","Data":"e670917e4f348cf3256ada42b19e331e3cfaa4ac463d6f46f290fec2ade196ca"}
Mar 16 00:30:20 crc kubenswrapper[4816]: I0316 00:30:20.917102 4816 generic.go:334] "Generic (PLEG): container finished" podID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" containerID="3f1fc959adebb5ee4efbb86c12974091024b401490dc7b39e795e6fb43175c7f" exitCode=0
Mar 16 00:30:20 crc kubenswrapper[4816]: I0316 00:30:20.917149 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e28d6969-ebed-4cf6-bb79-47e69bd952b9","Type":"ContainerDied","Data":"3f1fc959adebb5ee4efbb86c12974091024b401490dc7b39e795e6fb43175c7f"}
Mar 16 00:30:20 crc kubenswrapper[4816]: I0316 00:30:20.989187 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_e28d6969-ebed-4cf6-bb79-47e69bd952b9/manage-dockerfile/0.log"
Mar 16 00:30:21 crc kubenswrapper[4816]: I0316 00:30:21.926696 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e28d6969-ebed-4cf6-bb79-47e69bd952b9","Type":"ContainerStarted","Data":"9281601de1505bb064fa6238e6ff325c949730a49bc3d2962b8ca1dead5e53a7"}
Mar 16 00:30:21 crc kubenswrapper[4816]: I0316 00:30:21.963364 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-2-build" podStartSLOduration=4.963343551 podStartE2EDuration="4.963343551s" podCreationTimestamp="2026-03-16 00:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:30:21.961826042 +0000 UTC m=+1415.058126005" watchObservedRunningTime="2026-03-16 00:30:21.963343551 +0000 UTC m=+1415.059643504"
Mar 16 00:31:04 crc kubenswrapper[4816]: I0316 00:31:04.311507 4816 scope.go:117] "RemoveContainer" containerID="6050167de1d894cd0016711271e17ed54f0e6320bd8403d36883159d39c3c966"
Mar 16 00:31:06 crc kubenswrapper[4816]: I0316 00:31:06.254279 4816 generic.go:334] "Generic (PLEG): container finished" podID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" containerID="9281601de1505bb064fa6238e6ff325c949730a49bc3d2962b8ca1dead5e53a7" exitCode=0
Mar 16 00:31:06 crc kubenswrapper[4816]: I0316 00:31:06.254342 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e28d6969-ebed-4cf6-bb79-47e69bd952b9","Type":"ContainerDied","Data":"9281601de1505bb064fa6238e6ff325c949730a49bc3d2962b8ca1dead5e53a7"}
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.489800 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625351 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-proxy-ca-bundles\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") "
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625409 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-push\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") "
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625429 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-root\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") "
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625448 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-system-configs\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") "
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625475 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-node-pullsecrets\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") "
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625723 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-pull\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") "
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625764 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-blob-cache\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") "
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625797 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildworkdir\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") "
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625827 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-ca-bundles\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") "
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625798 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625852 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-run\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") "
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625887 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvt9g\" (UniqueName: \"kubernetes.io/projected/e28d6969-ebed-4cf6-bb79-47e69bd952b9-kube-api-access-xvt9g\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") "
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625919 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildcachedir\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") "
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.626195 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.626213 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.626298 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.626762 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.627881 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.628724 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.629092 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.631078 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e28d6969-ebed-4cf6-bb79-47e69bd952b9-kube-api-access-xvt9g" (OuterVolumeSpecName: "kube-api-access-xvt9g") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "kube-api-access-xvt9g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.636307 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.637167 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.731315 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732212 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732229 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732240 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732249 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732257 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732266 4816 
reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732276 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732284 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732292 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvt9g\" (UniqueName: \"kubernetes.io/projected/e28d6969-ebed-4cf6-bb79-47e69bd952b9-kube-api-access-xvt9g\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732300 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:08 crc kubenswrapper[4816]: I0316 00:31:08.269572 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e28d6969-ebed-4cf6-bb79-47e69bd952b9","Type":"ContainerDied","Data":"b69b4af40f3f142efec97d6238cfcbf6adef1518048da6359bdb13cfce32e6b9"} Mar 16 00:31:08 crc kubenswrapper[4816]: I0316 00:31:08.269619 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b69b4af40f3f142efec97d6238cfcbf6adef1518048da6359bdb13cfce32e6b9" Mar 16 00:31:08 crc kubenswrapper[4816]: I0316 00:31:08.269743 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 16 00:31:08 crc kubenswrapper[4816]: I0316 00:31:08.383516 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:31:08 crc kubenswrapper[4816]: I0316 00:31:08.443544 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.637991 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 16 00:31:11 crc kubenswrapper[4816]: E0316 00:31:11.638444 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" containerName="manage-dockerfile" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.638456 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" containerName="manage-dockerfile" Mar 16 00:31:11 crc kubenswrapper[4816]: E0316 00:31:11.638475 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" containerName="docker-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.638481 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" containerName="docker-build" Mar 16 00:31:11 crc kubenswrapper[4816]: E0316 00:31:11.638490 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" containerName="git-clone" Mar 16 00:31:11 crc 
kubenswrapper[4816]: I0316 00:31:11.638496 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" containerName="git-clone" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.638599 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" containerName="docker-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.639176 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.642413 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-global-ca" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.642449 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-ca" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.642742 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-sys-config" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.649592 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.661407 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.686222 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.686522 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.686643 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.686753 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.686870 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.686968 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-pull\") pod 
\"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.687095 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.687183 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbrs7\" (UniqueName: \"kubernetes.io/projected/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-kube-api-access-sbrs7\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.687300 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.687393 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.687491 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.687602 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788516 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788592 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788622 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 
16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788658 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788692 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788715 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788754 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788775 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbrs7\" (UniqueName: \"kubernetes.io/projected/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-kube-api-access-sbrs7\") pod \"prometheus-webhook-snmp-1-build\" (UID: 
\"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788808 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788833 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788862 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788885 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788976 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildcachedir\") pod 
\"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.789026 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.789512 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.789596 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.789763 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.789880 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.790589 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.790167 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.790836 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.794947 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.796297 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.808947 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbrs7\" (UniqueName: \"kubernetes.io/projected/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-kube-api-access-sbrs7\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.952062 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:12 crc kubenswrapper[4816]: I0316 00:31:12.399500 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 16 00:31:13 crc kubenswrapper[4816]: I0316 00:31:13.312359 4816 generic.go:334] "Generic (PLEG): container finished" podID="a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" containerID="f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78" exitCode=0 Mar 16 00:31:13 crc kubenswrapper[4816]: I0316 00:31:13.312412 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3","Type":"ContainerDied","Data":"f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78"} Mar 16 00:31:13 crc kubenswrapper[4816]: I0316 00:31:13.312443 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3","Type":"ContainerStarted","Data":"65c516056523a7bbc848a7d18a23fbbd58a3dce1c2461275ee579b7c5dc75a6c"} Mar 16 00:31:14 crc 
kubenswrapper[4816]: I0316 00:31:14.320824 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3","Type":"ContainerStarted","Data":"d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7"} Mar 16 00:31:14 crc kubenswrapper[4816]: I0316 00:31:14.356664 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=3.356644396 podStartE2EDuration="3.356644396s" podCreationTimestamp="2026-03-16 00:31:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:31:14.351181385 +0000 UTC m=+1467.447481348" watchObservedRunningTime="2026-03-16 00:31:14.356644396 +0000 UTC m=+1467.452944349" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.362698 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.363427 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" containerName="docker-build" containerID="cri-o://d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7" gracePeriod=30 Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.701492 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3/docker-build/0.log" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.702057 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766451 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-node-pullsecrets\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766563 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-system-configs\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766570 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766593 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildcachedir\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766632 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766689 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-push\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766758 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-blob-cache\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766788 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-ca-bundles\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766808 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-pull\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766829 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-run\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766884 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildworkdir\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766904 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbrs7\" (UniqueName: \"kubernetes.io/projected/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-kube-api-access-sbrs7\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766925 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-proxy-ca-bundles\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766963 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-root\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.767855 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.767988 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.768027 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.768049 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.768189 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.768649 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.769171 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.771909 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.773271 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-kube-api-access-sbrs7" (OuterVolumeSpecName: "kube-api-access-sbrs7") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "kube-api-access-sbrs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.773650 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.841796 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.869913 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.869971 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.869985 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.870002 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.870016 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.870028 4816 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-sbrs7\" (UniqueName: \"kubernetes.io/projected/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-kube-api-access-sbrs7\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.870037 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.870046 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.870055 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.140127 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.173538 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.390221 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3/docker-build/0.log" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.391005 4816 generic.go:334] "Generic (PLEG): container finished" podID="a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" containerID="d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7" exitCode=1 Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.391048 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3","Type":"ContainerDied","Data":"d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7"} Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.391084 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.391102 4816 scope.go:117] "RemoveContainer" containerID="d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.391089 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3","Type":"ContainerDied","Data":"65c516056523a7bbc848a7d18a23fbbd58a3dce1c2461275ee579b7c5dc75a6c"} Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.427921 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.431911 4816 scope.go:117] "RemoveContainer" containerID="f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.434626 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.456858 4816 scope.go:117] "RemoveContainer" containerID="d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7" Mar 16 00:31:23 crc kubenswrapper[4816]: E0316 00:31:23.457465 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7\": container with ID starting with d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7 not found: ID does not exist" containerID="d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.457513 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7"} err="failed 
to get container status \"d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7\": rpc error: code = NotFound desc = could not find container \"d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7\": container with ID starting with d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7 not found: ID does not exist" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.457545 4816 scope.go:117] "RemoveContainer" containerID="f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78" Mar 16 00:31:23 crc kubenswrapper[4816]: E0316 00:31:23.458091 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78\": container with ID starting with f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78 not found: ID does not exist" containerID="f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.458142 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78"} err="failed to get container status \"f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78\": rpc error: code = NotFound desc = could not find container \"f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78\": container with ID starting with f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78 not found: ID does not exist" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.682086 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" path="/var/lib/kubelet/pods/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3/volumes" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.978682 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 16 00:31:23 crc kubenswrapper[4816]: E0316 00:31:23.978887 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" containerName="docker-build" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.978899 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" containerName="docker-build" Mar 16 00:31:23 crc kubenswrapper[4816]: E0316 00:31:23.978911 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" containerName="manage-dockerfile" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.978918 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" containerName="manage-dockerfile" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.979015 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" containerName="docker-build" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.979751 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.981525 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-ca" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.981730 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.981569 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-global-ca" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.982885 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-sys-config" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.002764 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085314 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085376 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085403 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085439 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085462 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085496 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085601 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085626 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085660 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085687 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085754 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvw6d\" (UniqueName: \"kubernetes.io/projected/8ee83ac3-283d-44bb-8ad6-e78604301d3a-kube-api-access-jvw6d\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085781 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.186575 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.186885 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187019 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187163 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187277 
4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvw6d\" (UniqueName: \"kubernetes.io/projected/8ee83ac3-283d-44bb-8ad6-e78604301d3a-kube-api-access-jvw6d\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187400 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187486 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187508 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187669 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" 
Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187712 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187759 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187801 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187880 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187921 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.188044 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.188500 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.188643 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.188772 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.188882 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: 
\"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.188884 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.189322 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.191395 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.192648 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.210239 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvw6d\" (UniqueName: 
\"kubernetes.io/projected/8ee83ac3-283d-44bb-8ad6-e78604301d3a-kube-api-access-jvw6d\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.294657 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.743409 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 16 00:31:25 crc kubenswrapper[4816]: I0316 00:31:25.410121 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"8ee83ac3-283d-44bb-8ad6-e78604301d3a","Type":"ContainerStarted","Data":"bfc7e6a557fdaa90a0e926ad8cf423d7efa51c6fbbded9d245dc93674dc1ace2"} Mar 16 00:31:25 crc kubenswrapper[4816]: I0316 00:31:25.410510 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"8ee83ac3-283d-44bb-8ad6-e78604301d3a","Type":"ContainerStarted","Data":"0113d4b80ed0b8fe7937cde689e4ea8e705fc8e36ad867715f3142b9de604104"} Mar 16 00:31:26 crc kubenswrapper[4816]: I0316 00:31:26.417274 4816 generic.go:334] "Generic (PLEG): container finished" podID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerID="bfc7e6a557fdaa90a0e926ad8cf423d7efa51c6fbbded9d245dc93674dc1ace2" exitCode=0 Mar 16 00:31:26 crc kubenswrapper[4816]: I0316 00:31:26.417333 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"8ee83ac3-283d-44bb-8ad6-e78604301d3a","Type":"ContainerDied","Data":"bfc7e6a557fdaa90a0e926ad8cf423d7efa51c6fbbded9d245dc93674dc1ace2"} Mar 16 00:31:27 crc kubenswrapper[4816]: I0316 00:31:27.428383 4816 generic.go:334] "Generic (PLEG): container finished" 
podID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerID="e8de89caf63a1038e2ee8bd7df762a30c32b6504dafc4a5c1371cd611f17f793" exitCode=0 Mar 16 00:31:27 crc kubenswrapper[4816]: I0316 00:31:27.428463 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"8ee83ac3-283d-44bb-8ad6-e78604301d3a","Type":"ContainerDied","Data":"e8de89caf63a1038e2ee8bd7df762a30c32b6504dafc4a5c1371cd611f17f793"} Mar 16 00:31:27 crc kubenswrapper[4816]: I0316 00:31:27.470724 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_8ee83ac3-283d-44bb-8ad6-e78604301d3a/manage-dockerfile/0.log" Mar 16 00:31:28 crc kubenswrapper[4816]: I0316 00:31:28.442221 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"8ee83ac3-283d-44bb-8ad6-e78604301d3a","Type":"ContainerStarted","Data":"0902f30878c51e4f3236bd867ace5550ff04114b09d7f0926101a8f73cf8cc0d"} Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.127617 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=37.127597176 podStartE2EDuration="37.127597176s" podCreationTimestamp="2026-03-16 00:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:31:28.470767718 +0000 UTC m=+1481.567067701" watchObservedRunningTime="2026-03-16 00:32:00.127597176 +0000 UTC m=+1513.223897129" Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.135287 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560352-4c2cj"] Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.136270 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560352-4c2cj" Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.138220 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.140856 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560352-4c2cj"] Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.140978 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.142773 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.215672 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdzr7\" (UniqueName: \"kubernetes.io/projected/cee5a2cc-4256-43fb-9517-83533a5acf29-kube-api-access-qdzr7\") pod \"auto-csr-approver-29560352-4c2cj\" (UID: \"cee5a2cc-4256-43fb-9517-83533a5acf29\") " pod="openshift-infra/auto-csr-approver-29560352-4c2cj" Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.317056 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdzr7\" (UniqueName: \"kubernetes.io/projected/cee5a2cc-4256-43fb-9517-83533a5acf29-kube-api-access-qdzr7\") pod \"auto-csr-approver-29560352-4c2cj\" (UID: \"cee5a2cc-4256-43fb-9517-83533a5acf29\") " pod="openshift-infra/auto-csr-approver-29560352-4c2cj" Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.338700 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdzr7\" (UniqueName: \"kubernetes.io/projected/cee5a2cc-4256-43fb-9517-83533a5acf29-kube-api-access-qdzr7\") pod \"auto-csr-approver-29560352-4c2cj\" (UID: \"cee5a2cc-4256-43fb-9517-83533a5acf29\") " 
pod="openshift-infra/auto-csr-approver-29560352-4c2cj" Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.460356 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560352-4c2cj" Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.734240 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560352-4c2cj"] Mar 16 00:32:01 crc kubenswrapper[4816]: I0316 00:32:01.674363 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560352-4c2cj" event={"ID":"cee5a2cc-4256-43fb-9517-83533a5acf29","Type":"ContainerStarted","Data":"846e84f9793585d6acd707d0990c6f9bf7f849e2e38887a2322410f6a6e52271"} Mar 16 00:32:01 crc kubenswrapper[4816]: I0316 00:32:01.863584 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:32:01 crc kubenswrapper[4816]: I0316 00:32:01.863641 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:32:02 crc kubenswrapper[4816]: I0316 00:32:02.679477 4816 generic.go:334] "Generic (PLEG): container finished" podID="cee5a2cc-4256-43fb-9517-83533a5acf29" containerID="2169e8fca36c31b741a4793cc4a50c325f1ec3d6141a69fbe357f1c522080d5b" exitCode=0 Mar 16 00:32:02 crc kubenswrapper[4816]: I0316 00:32:02.679591 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560352-4c2cj" 
event={"ID":"cee5a2cc-4256-43fb-9517-83533a5acf29","Type":"ContainerDied","Data":"2169e8fca36c31b741a4793cc4a50c325f1ec3d6141a69fbe357f1c522080d5b"} Mar 16 00:32:03 crc kubenswrapper[4816]: I0316 00:32:03.923977 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560352-4c2cj" Mar 16 00:32:04 crc kubenswrapper[4816]: I0316 00:32:04.066798 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdzr7\" (UniqueName: \"kubernetes.io/projected/cee5a2cc-4256-43fb-9517-83533a5acf29-kube-api-access-qdzr7\") pod \"cee5a2cc-4256-43fb-9517-83533a5acf29\" (UID: \"cee5a2cc-4256-43fb-9517-83533a5acf29\") " Mar 16 00:32:04 crc kubenswrapper[4816]: I0316 00:32:04.072981 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cee5a2cc-4256-43fb-9517-83533a5acf29-kube-api-access-qdzr7" (OuterVolumeSpecName: "kube-api-access-qdzr7") pod "cee5a2cc-4256-43fb-9517-83533a5acf29" (UID: "cee5a2cc-4256-43fb-9517-83533a5acf29"). InnerVolumeSpecName "kube-api-access-qdzr7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:32:04 crc kubenswrapper[4816]: I0316 00:32:04.168397 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdzr7\" (UniqueName: \"kubernetes.io/projected/cee5a2cc-4256-43fb-9517-83533a5acf29-kube-api-access-qdzr7\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:04 crc kubenswrapper[4816]: I0316 00:32:04.694344 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560352-4c2cj" event={"ID":"cee5a2cc-4256-43fb-9517-83533a5acf29","Type":"ContainerDied","Data":"846e84f9793585d6acd707d0990c6f9bf7f849e2e38887a2322410f6a6e52271"} Mar 16 00:32:04 crc kubenswrapper[4816]: I0316 00:32:04.694625 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="846e84f9793585d6acd707d0990c6f9bf7f849e2e38887a2322410f6a6e52271" Mar 16 00:32:04 crc kubenswrapper[4816]: I0316 00:32:04.694422 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560352-4c2cj" Mar 16 00:32:04 crc kubenswrapper[4816]: I0316 00:32:04.985156 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560346-hjpvk"] Mar 16 00:32:04 crc kubenswrapper[4816]: I0316 00:32:04.992429 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560346-hjpvk"] Mar 16 00:32:05 crc kubenswrapper[4816]: I0316 00:32:05.676432 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2942e78f-05b7-486f-bee0-93a942f80d8a" path="/var/lib/kubelet/pods/2942e78f-05b7-486f-bee0-93a942f80d8a/volumes" Mar 16 00:32:18 crc kubenswrapper[4816]: I0316 00:32:18.790825 4816 generic.go:334] "Generic (PLEG): container finished" podID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerID="0902f30878c51e4f3236bd867ace5550ff04114b09d7f0926101a8f73cf8cc0d" exitCode=0 Mar 16 00:32:18 crc kubenswrapper[4816]: I0316 00:32:18.790935 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"8ee83ac3-283d-44bb-8ad6-e78604301d3a","Type":"ContainerDied","Data":"0902f30878c51e4f3236bd867ace5550ff04114b09d7f0926101a8f73cf8cc0d"} Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.085603 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190067 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvw6d\" (UniqueName: \"kubernetes.io/projected/8ee83ac3-283d-44bb-8ad6-e78604301d3a-kube-api-access-jvw6d\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190105 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildcachedir\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190150 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-push\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190192 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-system-configs\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190223 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-blob-cache\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190226 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190260 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-proxy-ca-bundles\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190285 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-ca-bundles\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190310 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-pull\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190336 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-node-pullsecrets\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190369 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-root\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190396 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190419 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-run\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190460 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildworkdir\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190729 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-node-pullsecrets\") on node 
\"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190746 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190987 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.191355 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.192434 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.192532 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.195610 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.195891 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.196508 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ee83ac3-283d-44bb-8ad6-e78604301d3a-kube-api-access-jvw6d" (OuterVolumeSpecName: "kube-api-access-jvw6d") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "kube-api-access-jvw6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.202714 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.283328 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.292133 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.292170 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.292180 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.292190 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.292200 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.292209 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvw6d\" (UniqueName: \"kubernetes.io/projected/8ee83ac3-283d-44bb-8ad6-e78604301d3a-kube-api-access-jvw6d\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.292217 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.292226 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.292235 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.807838 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"8ee83ac3-283d-44bb-8ad6-e78604301d3a","Type":"ContainerDied","Data":"0113d4b80ed0b8fe7937cde689e4ea8e705fc8e36ad867715f3142b9de604104"} Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.808071 4816 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="0113d4b80ed0b8fe7937cde689e4ea8e705fc8e36ad867715f3142b9de604104" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.807910 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.013337 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tmfj2"] Mar 16 00:32:21 crc kubenswrapper[4816]: E0316 00:32:21.013862 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee5a2cc-4256-43fb-9517-83533a5acf29" containerName="oc" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.013902 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee5a2cc-4256-43fb-9517-83533a5acf29" containerName="oc" Mar 16 00:32:21 crc kubenswrapper[4816]: E0316 00:32:21.013929 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerName="manage-dockerfile" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.013944 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerName="manage-dockerfile" Mar 16 00:32:21 crc kubenswrapper[4816]: E0316 00:32:21.013971 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerName="docker-build" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.013985 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerName="docker-build" Mar 16 00:32:21 crc kubenswrapper[4816]: E0316 00:32:21.014007 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerName="git-clone" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.014020 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerName="git-clone" Mar 
16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.014246 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee5a2cc-4256-43fb-9517-83533a5acf29" containerName="oc" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.014282 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerName="docker-build" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.015901 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.025201 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tmfj2"] Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.102909 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-utilities\") pod \"redhat-operators-tmfj2\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.102980 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-catalog-content\") pod \"redhat-operators-tmfj2\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.103012 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bmlt\" (UniqueName: \"kubernetes.io/projected/05447e0c-cfec-4548-a367-b4058cd9ee40-kube-api-access-9bmlt\") pod \"redhat-operators-tmfj2\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc 
kubenswrapper[4816]: I0316 00:32:21.199995 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.204771 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-catalog-content\") pod \"redhat-operators-tmfj2\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.204817 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bmlt\" (UniqueName: \"kubernetes.io/projected/05447e0c-cfec-4548-a367-b4058cd9ee40-kube-api-access-9bmlt\") pod \"redhat-operators-tmfj2\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.204874 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-utilities\") pod \"redhat-operators-tmfj2\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.204919 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.205307 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-utilities\") pod \"redhat-operators-tmfj2\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.205371 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-catalog-content\") pod \"redhat-operators-tmfj2\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.226319 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bmlt\" (UniqueName: \"kubernetes.io/projected/05447e0c-cfec-4548-a367-b4058cd9ee40-kube-api-access-9bmlt\") pod \"redhat-operators-tmfj2\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.383387 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.834278 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tmfj2"] Mar 16 00:32:22 crc kubenswrapper[4816]: I0316 00:32:22.830319 4816 generic.go:334] "Generic (PLEG): container finished" podID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerID="25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6" exitCode=0 Mar 16 00:32:22 crc kubenswrapper[4816]: I0316 00:32:22.830405 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmfj2" event={"ID":"05447e0c-cfec-4548-a367-b4058cd9ee40","Type":"ContainerDied","Data":"25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6"} Mar 16 00:32:22 crc kubenswrapper[4816]: I0316 00:32:22.830659 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmfj2" event={"ID":"05447e0c-cfec-4548-a367-b4058cd9ee40","Type":"ContainerStarted","Data":"788e7b9c485e2da4d4c729d6018e362f67e9125881eaf7b3347cf2e96230957c"} Mar 16 00:32:23 crc kubenswrapper[4816]: I0316 00:32:23.839402 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmfj2" event={"ID":"05447e0c-cfec-4548-a367-b4058cd9ee40","Type":"ContainerStarted","Data":"b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8"} Mar 16 00:32:24 crc kubenswrapper[4816]: I0316 00:32:24.848138 4816 generic.go:334] "Generic (PLEG): container finished" podID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerID="b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8" exitCode=0 Mar 16 00:32:24 crc kubenswrapper[4816]: I0316 00:32:24.848196 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmfj2" 
event={"ID":"05447e0c-cfec-4548-a367-b4058cd9ee40","Type":"ContainerDied","Data":"b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8"} Mar 16 00:32:25 crc kubenswrapper[4816]: I0316 00:32:25.856671 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmfj2" event={"ID":"05447e0c-cfec-4548-a367-b4058cd9ee40","Type":"ContainerStarted","Data":"ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a"} Mar 16 00:32:25 crc kubenswrapper[4816]: I0316 00:32:25.881774 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tmfj2" podStartSLOduration=3.461727266 podStartE2EDuration="5.881749662s" podCreationTimestamp="2026-03-16 00:32:20 +0000 UTC" firstStartedPulling="2026-03-16 00:32:22.834850755 +0000 UTC m=+1535.931150728" lastFinishedPulling="2026-03-16 00:32:25.254873171 +0000 UTC m=+1538.351173124" observedRunningTime="2026-03-16 00:32:25.880195358 +0000 UTC m=+1538.976495351" watchObservedRunningTime="2026-03-16 00:32:25.881749662 +0000 UTC m=+1538.978049635" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.434454 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.435968 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.437476 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.437828 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-ca" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.439441 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-global-ca" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.439442 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-sys-config" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.452733 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.519722 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.519771 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc 
kubenswrapper[4816]: I0316 00:32:29.519803 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.519855 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.519917 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.519941 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.519956 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vftwd\" (UniqueName: 
\"kubernetes.io/projected/38d77c46-58bc-4dd3-a874-85e5b14c1585-kube-api-access-vftwd\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.520014 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.520061 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.520078 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.520110 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " 
pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.520127 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.621960 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.622385 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.623055 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.623399 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.623859 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.624051 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vftwd\" (UniqueName: \"kubernetes.io/projected/38d77c46-58bc-4dd3-a874-85e5b14c1585-kube-api-access-vftwd\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.624202 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.624310 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 
16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.624393 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.624529 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.624691 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.624836 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.624874 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: 
\"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.624737 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.625135 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.625223 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.625328 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.625540 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.625705 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.625814 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.626478 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.629658 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc 
kubenswrapper[4816]: I0316 00:32:29.638079 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.646163 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vftwd\" (UniqueName: \"kubernetes.io/projected/38d77c46-58bc-4dd3-a874-85e5b14c1585-kube-api-access-vftwd\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.756301 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:30 crc kubenswrapper[4816]: W0316 00:32:30.021235 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38d77c46_58bc_4dd3_a874_85e5b14c1585.slice/crio-eca8ebc315293f56134648b608d393b026abc700dcac227487eb6afcf4852dbb WatchSource:0}: Error finding container eca8ebc315293f56134648b608d393b026abc700dcac227487eb6afcf4852dbb: Status 404 returned error can't find the container with id eca8ebc315293f56134648b608d393b026abc700dcac227487eb6afcf4852dbb Mar 16 00:32:30 crc kubenswrapper[4816]: I0316 00:32:30.027166 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 16 00:32:30 crc kubenswrapper[4816]: I0316 00:32:30.903455 4816 generic.go:334] "Generic (PLEG): container finished" podID="38d77c46-58bc-4dd3-a874-85e5b14c1585" 
containerID="520f1908678615d0e5b73ebdbbe6a48ebe9c84b2afda6fc6f03c6d31b9a2fb39" exitCode=0 Mar 16 00:32:30 crc kubenswrapper[4816]: I0316 00:32:30.903580 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"38d77c46-58bc-4dd3-a874-85e5b14c1585","Type":"ContainerDied","Data":"520f1908678615d0e5b73ebdbbe6a48ebe9c84b2afda6fc6f03c6d31b9a2fb39"} Mar 16 00:32:30 crc kubenswrapper[4816]: I0316 00:32:30.903848 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"38d77c46-58bc-4dd3-a874-85e5b14c1585","Type":"ContainerStarted","Data":"eca8ebc315293f56134648b608d393b026abc700dcac227487eb6afcf4852dbb"} Mar 16 00:32:31 crc kubenswrapper[4816]: I0316 00:32:31.384309 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:31 crc kubenswrapper[4816]: I0316 00:32:31.384381 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:31 crc kubenswrapper[4816]: I0316 00:32:31.863674 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:32:31 crc kubenswrapper[4816]: I0316 00:32:31.863768 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:32:31 crc kubenswrapper[4816]: I0316 00:32:31.918643 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"38d77c46-58bc-4dd3-a874-85e5b14c1585","Type":"ContainerStarted","Data":"d67550a0f7d36330df42139700602ca21750bc1c1f58bc1cc3a210d0f86409d3"} Mar 16 00:32:31 crc kubenswrapper[4816]: I0316 00:32:31.945716 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-1-build" podStartSLOduration=2.945694067 podStartE2EDuration="2.945694067s" podCreationTimestamp="2026-03-16 00:32:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:32:31.943846666 +0000 UTC m=+1545.040146629" watchObservedRunningTime="2026-03-16 00:32:31.945694067 +0000 UTC m=+1545.041994020" Mar 16 00:32:32 crc kubenswrapper[4816]: I0316 00:32:32.431765 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tmfj2" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerName="registry-server" probeResult="failure" output=< Mar 16 00:32:32 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 16 00:32:32 crc kubenswrapper[4816]: > Mar 16 00:32:33 crc kubenswrapper[4816]: I0316 00:32:33.935365 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_38d77c46-58bc-4dd3-a874-85e5b14c1585/docker-build/0.log" Mar 16 00:32:33 crc kubenswrapper[4816]: I0316 00:32:33.937216 4816 generic.go:334] "Generic (PLEG): container finished" podID="38d77c46-58bc-4dd3-a874-85e5b14c1585" containerID="d67550a0f7d36330df42139700602ca21750bc1c1f58bc1cc3a210d0f86409d3" exitCode=1 Mar 16 00:32:33 crc kubenswrapper[4816]: I0316 00:32:33.937288 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" 
event={"ID":"38d77c46-58bc-4dd3-a874-85e5b14c1585","Type":"ContainerDied","Data":"d67550a0f7d36330df42139700602ca21750bc1c1f58bc1cc3a210d0f86409d3"} Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.158910 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_38d77c46-58bc-4dd3-a874-85e5b14c1585/docker-build/0.log" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.159804 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.301317 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildcachedir\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.301377 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-root\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.301404 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.301424 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildworkdir\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.301476 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vftwd\" (UniqueName: \"kubernetes.io/projected/38d77c46-58bc-4dd3-a874-85e5b14c1585-kube-api-access-vftwd\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.301510 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-system-configs\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.301568 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-blob-cache\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302050 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302128 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-run\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302191 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-node-pullsecrets\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302190 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302252 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302293 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-ca-bundles\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302318 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-push\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302384 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-proxy-ca-bundles\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302500 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302598 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-pull\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303253 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303364 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303387 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303378 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303400 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303442 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303463 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303481 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303627 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303976 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.308003 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.308080 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.309602 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38d77c46-58bc-4dd3-a874-85e5b14c1585-kube-api-access-vftwd" (OuterVolumeSpecName: "kube-api-access-vftwd") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "kube-api-access-vftwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.405448 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.405506 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.405528 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.405547 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.405601 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.405624 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vftwd\" (UniqueName: \"kubernetes.io/projected/38d77c46-58bc-4dd3-a874-85e5b14c1585-kube-api-access-vftwd\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.952172 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_38d77c46-58bc-4dd3-a874-85e5b14c1585/docker-build/0.log" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.953172 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"38d77c46-58bc-4dd3-a874-85e5b14c1585","Type":"ContainerDied","Data":"eca8ebc315293f56134648b608d393b026abc700dcac227487eb6afcf4852dbb"} Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.953220 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eca8ebc315293f56134648b608d393b026abc700dcac227487eb6afcf4852dbb" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.953254 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:39 crc kubenswrapper[4816]: I0316 00:32:39.923715 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 16 00:32:39 crc kubenswrapper[4816]: I0316 00:32:39.931998 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.425649 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.470156 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.549753 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 16 00:32:41 crc kubenswrapper[4816]: E0316 00:32:41.550041 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d77c46-58bc-4dd3-a874-85e5b14c1585" 
containerName="docker-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.550067 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d77c46-58bc-4dd3-a874-85e5b14c1585" containerName="docker-build" Mar 16 00:32:41 crc kubenswrapper[4816]: E0316 00:32:41.550090 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d77c46-58bc-4dd3-a874-85e5b14c1585" containerName="manage-dockerfile" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.550099 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d77c46-58bc-4dd3-a874-85e5b14c1585" containerName="manage-dockerfile" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.550226 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="38d77c46-58bc-4dd3-a874-85e5b14c1585" containerName="docker-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.551443 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.555285 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-ca" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.555420 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.555310 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-global-ca" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.555849 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-sys-config" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.570229 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 16 00:32:41 
crc kubenswrapper[4816]: I0316 00:32:41.663591 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tmfj2"] Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.675775 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38d77c46-58bc-4dd3-a874-85e5b14c1585" path="/var/lib/kubelet/pods/38d77c46-58bc-4dd3-a874-85e5b14c1585/volumes" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688262 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688298 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688321 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688342 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688368 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688387 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688406 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688423 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: 
\"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688442 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nqgb\" (UniqueName: \"kubernetes.io/projected/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-kube-api-access-4nqgb\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688463 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688632 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688693 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789749 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789824 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789844 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789863 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789882 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " 
pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789913 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789932 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789950 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789966 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789984 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nqgb\" 
(UniqueName: \"kubernetes.io/projected/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-kube-api-access-4nqgb\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.790004 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.790030 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.790086 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.790151 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 
00:32:41.790458 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.790495 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.790817 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.791705 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.791757 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: 
\"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.791888 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.792023 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.794539 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.796204 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.810235 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nqgb\" 
(UniqueName: \"kubernetes.io/projected/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-kube-api-access-4nqgb\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.866751 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:42 crc kubenswrapper[4816]: I0316 00:32:42.047776 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.003692 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"4b2b3bef-a66e-4caa-bf69-164562b1dfd6","Type":"ContainerStarted","Data":"0366a024e1fb0871713f897d2b15717fdfe061cb2143d61741b0493ff4eab489"} Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.004431 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"4b2b3bef-a66e-4caa-bf69-164562b1dfd6","Type":"ContainerStarted","Data":"4e56eb9def807a83219554ca3506a988f161f2353f6b9de86cc0c12cc3ff1192"} Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.003779 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tmfj2" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerName="registry-server" containerID="cri-o://ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a" gracePeriod=2 Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.370248 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.526951 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-utilities\") pod \"05447e0c-cfec-4548-a367-b4058cd9ee40\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.527086 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-catalog-content\") pod \"05447e0c-cfec-4548-a367-b4058cd9ee40\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.527206 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bmlt\" (UniqueName: \"kubernetes.io/projected/05447e0c-cfec-4548-a367-b4058cd9ee40-kube-api-access-9bmlt\") pod \"05447e0c-cfec-4548-a367-b4058cd9ee40\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.528108 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-utilities" (OuterVolumeSpecName: "utilities") pod "05447e0c-cfec-4548-a367-b4058cd9ee40" (UID: "05447e0c-cfec-4548-a367-b4058cd9ee40"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.532975 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05447e0c-cfec-4548-a367-b4058cd9ee40-kube-api-access-9bmlt" (OuterVolumeSpecName: "kube-api-access-9bmlt") pod "05447e0c-cfec-4548-a367-b4058cd9ee40" (UID: "05447e0c-cfec-4548-a367-b4058cd9ee40"). InnerVolumeSpecName "kube-api-access-9bmlt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.629099 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.629143 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bmlt\" (UniqueName: \"kubernetes.io/projected/05447e0c-cfec-4548-a367-b4058cd9ee40-kube-api-access-9bmlt\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.722343 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05447e0c-cfec-4548-a367-b4058cd9ee40" (UID: "05447e0c-cfec-4548-a367-b4058cd9ee40"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.730187 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.012450 4816 generic.go:334] "Generic (PLEG): container finished" podID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerID="ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a" exitCode=0 Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.012535 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.012566 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmfj2" event={"ID":"05447e0c-cfec-4548-a367-b4058cd9ee40","Type":"ContainerDied","Data":"ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a"} Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.013014 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmfj2" event={"ID":"05447e0c-cfec-4548-a367-b4058cd9ee40","Type":"ContainerDied","Data":"788e7b9c485e2da4d4c729d6018e362f67e9125881eaf7b3347cf2e96230957c"} Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.013032 4816 scope.go:117] "RemoveContainer" containerID="ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.014526 4816 generic.go:334] "Generic (PLEG): container finished" podID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerID="0366a024e1fb0871713f897d2b15717fdfe061cb2143d61741b0493ff4eab489" exitCode=0 Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.014572 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"4b2b3bef-a66e-4caa-bf69-164562b1dfd6","Type":"ContainerDied","Data":"0366a024e1fb0871713f897d2b15717fdfe061cb2143d61741b0493ff4eab489"} Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.032739 4816 scope.go:117] "RemoveContainer" containerID="b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.075638 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tmfj2"] Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.077978 4816 scope.go:117] "RemoveContainer" 
containerID="25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.080763 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tmfj2"] Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.104293 4816 scope.go:117] "RemoveContainer" containerID="ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a" Mar 16 00:32:44 crc kubenswrapper[4816]: E0316 00:32:44.104830 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a\": container with ID starting with ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a not found: ID does not exist" containerID="ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.104883 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a"} err="failed to get container status \"ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a\": rpc error: code = NotFound desc = could not find container \"ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a\": container with ID starting with ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a not found: ID does not exist" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.104909 4816 scope.go:117] "RemoveContainer" containerID="b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8" Mar 16 00:32:44 crc kubenswrapper[4816]: E0316 00:32:44.105244 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8\": container with ID starting with 
b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8 not found: ID does not exist" containerID="b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.105279 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8"} err="failed to get container status \"b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8\": rpc error: code = NotFound desc = could not find container \"b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8\": container with ID starting with b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8 not found: ID does not exist" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.105304 4816 scope.go:117] "RemoveContainer" containerID="25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6" Mar 16 00:32:44 crc kubenswrapper[4816]: E0316 00:32:44.105579 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6\": container with ID starting with 25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6 not found: ID does not exist" containerID="25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.105617 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6"} err="failed to get container status \"25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6\": rpc error: code = NotFound desc = could not find container \"25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6\": container with ID starting with 25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6 not found: ID does not 
exist" Mar 16 00:32:45 crc kubenswrapper[4816]: I0316 00:32:45.023116 4816 generic.go:334] "Generic (PLEG): container finished" podID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerID="d77eca03b254d697004fd4427a5501efd30125dde6d6f8186d5997a043ed31c6" exitCode=0 Mar 16 00:32:45 crc kubenswrapper[4816]: I0316 00:32:45.023178 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"4b2b3bef-a66e-4caa-bf69-164562b1dfd6","Type":"ContainerDied","Data":"d77eca03b254d697004fd4427a5501efd30125dde6d6f8186d5997a043ed31c6"} Mar 16 00:32:45 crc kubenswrapper[4816]: I0316 00:32:45.067154 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-2-build_4b2b3bef-a66e-4caa-bf69-164562b1dfd6/manage-dockerfile/0.log" Mar 16 00:32:45 crc kubenswrapper[4816]: I0316 00:32:45.680334 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" path="/var/lib/kubelet/pods/05447e0c-cfec-4548-a367-b4058cd9ee40/volumes" Mar 16 00:32:46 crc kubenswrapper[4816]: I0316 00:32:46.033205 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"4b2b3bef-a66e-4caa-bf69-164562b1dfd6","Type":"ContainerStarted","Data":"9220ae7cef81439d24ea0bf56c09c98a6723b9dba8abd218142c8db7d16c74ff"} Mar 16 00:32:46 crc kubenswrapper[4816]: I0316 00:32:46.059707 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-2-build" podStartSLOduration=5.059683912 podStartE2EDuration="5.059683912s" podCreationTimestamp="2026-03-16 00:32:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:32:46.05636657 +0000 UTC m=+1559.152666523" watchObservedRunningTime="2026-03-16 00:32:46.059683912 +0000 
UTC m=+1559.155983865" Mar 16 00:32:49 crc kubenswrapper[4816]: I0316 00:32:49.054089 4816 generic.go:334] "Generic (PLEG): container finished" podID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerID="9220ae7cef81439d24ea0bf56c09c98a6723b9dba8abd218142c8db7d16c74ff" exitCode=0 Mar 16 00:32:49 crc kubenswrapper[4816]: I0316 00:32:49.054187 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"4b2b3bef-a66e-4caa-bf69-164562b1dfd6","Type":"ContainerDied","Data":"9220ae7cef81439d24ea0bf56c09c98a6723b9dba8abd218142c8db7d16c74ff"} Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.278515 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421251 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-proxy-ca-bundles\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421322 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildcachedir\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421369 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-root\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421407 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-system-configs\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421435 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-push\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421506 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildworkdir\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421541 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nqgb\" (UniqueName: \"kubernetes.io/projected/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-kube-api-access-4nqgb\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421600 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-pull\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421651 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-blob-cache\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421675 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-run\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421708 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-ca-bundles\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421750 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-node-pullsecrets\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421879 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421943 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.422231 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.422410 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.422431 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.422442 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.422580 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-ca-bundles" 
(OuterVolumeSpecName: "build-ca-bundles") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.422622 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.422884 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.422943 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.426038 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.428101 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.428117 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.429106 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-kube-api-access-4nqgb" (OuterVolumeSpecName: "kube-api-access-4nqgb") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "kube-api-access-4nqgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.429819 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.523116 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.523148 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.523157 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.523168 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.523176 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nqgb\" (UniqueName: \"kubernetes.io/projected/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-kube-api-access-4nqgb\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.523184 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.523192 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-blob-cache\") on node 
\"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.523235 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.523244 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:51 crc kubenswrapper[4816]: I0316 00:32:51.071604 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"4b2b3bef-a66e-4caa-bf69-164562b1dfd6","Type":"ContainerDied","Data":"4e56eb9def807a83219554ca3506a988f161f2353f6b9de86cc0c12cc3ff1192"} Mar 16 00:32:51 crc kubenswrapper[4816]: I0316 00:32:51.071829 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e56eb9def807a83219554ca3506a988f161f2353f6b9de86cc0c12cc3ff1192" Mar 16 00:32:51 crc kubenswrapper[4816]: I0316 00:32:51.072232 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.231027 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 16 00:32:54 crc kubenswrapper[4816]: E0316 00:32:54.232828 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerName="git-clone" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.232866 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerName="git-clone" Mar 16 00:32:54 crc kubenswrapper[4816]: E0316 00:32:54.232877 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerName="registry-server" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.232883 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerName="registry-server" Mar 16 00:32:54 crc kubenswrapper[4816]: E0316 00:32:54.232895 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerName="extract-utilities" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.232902 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerName="extract-utilities" Mar 16 00:32:54 crc kubenswrapper[4816]: E0316 00:32:54.232911 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerName="docker-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.232919 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerName="docker-build" Mar 16 00:32:54 crc kubenswrapper[4816]: E0316 00:32:54.232927 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerName="manage-dockerfile" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.232934 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerName="manage-dockerfile" Mar 16 00:32:54 crc kubenswrapper[4816]: E0316 00:32:54.232946 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerName="extract-content" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.232952 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerName="extract-content" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.233065 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerName="docker-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.233084 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerName="registry-server" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.233738 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.236006 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-ca" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.236365 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.236810 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-global-ca" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.236979 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-sys-config" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.242310 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.371858 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372010 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372063 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372173 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372215 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372260 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372350 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfr8f\" (UniqueName: \"kubernetes.io/projected/7c03433b-4b96-4172-b344-de3e72b52900-kube-api-access-dfr8f\") pod 
\"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372404 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372490 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372515 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372609 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372713 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.473903 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.473955 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.473994 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474024 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: 
\"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474053 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474094 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfr8f\" (UniqueName: \"kubernetes.io/projected/7c03433b-4b96-4172-b344-de3e72b52900-kube-api-access-dfr8f\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474124 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474159 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474182 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474214 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474258 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474292 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474424 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474535 
4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474602 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474825 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474943 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.475738 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " 
pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.475763 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.475757 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.476152 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.481103 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.485943 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: 
\"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.491403 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfr8f\" (UniqueName: \"kubernetes.io/projected/7c03433b-4b96-4172-b344-de3e72b52900-kube-api-access-dfr8f\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.549336 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.750986 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 16 00:32:55 crc kubenswrapper[4816]: I0316 00:32:55.109617 4816 generic.go:334] "Generic (PLEG): container finished" podID="7c03433b-4b96-4172-b344-de3e72b52900" containerID="e30d5325e2ed5b1448d06f7c2b9b149b118aecd27db13855743e289187e36f13" exitCode=0 Mar 16 00:32:55 crc kubenswrapper[4816]: I0316 00:32:55.109670 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"7c03433b-4b96-4172-b344-de3e72b52900","Type":"ContainerDied","Data":"e30d5325e2ed5b1448d06f7c2b9b149b118aecd27db13855743e289187e36f13"} Mar 16 00:32:55 crc kubenswrapper[4816]: I0316 00:32:55.109699 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"7c03433b-4b96-4172-b344-de3e72b52900","Type":"ContainerStarted","Data":"68913983495e9d0f1ddcf3da8b311626b33a5d1edfe1be51675b53ba41003d13"} Mar 16 00:32:56 crc 
kubenswrapper[4816]: I0316 00:32:56.122198 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_7c03433b-4b96-4172-b344-de3e72b52900/docker-build/0.log" Mar 16 00:32:56 crc kubenswrapper[4816]: I0316 00:32:56.123072 4816 generic.go:334] "Generic (PLEG): container finished" podID="7c03433b-4b96-4172-b344-de3e72b52900" containerID="e4439cd35f13a68a04a6b45eaa00f3aeec10afe4bbea233d056d60648e32f1e4" exitCode=1 Mar 16 00:32:56 crc kubenswrapper[4816]: I0316 00:32:56.123118 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"7c03433b-4b96-4172-b344-de3e72b52900","Type":"ContainerDied","Data":"e4439cd35f13a68a04a6b45eaa00f3aeec10afe4bbea233d056d60648e32f1e4"} Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.352915 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_7c03433b-4b96-4172-b344-de3e72b52900/docker-build/0.log" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.353726 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469112 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-root\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469199 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfr8f\" (UniqueName: \"kubernetes.io/projected/7c03433b-4b96-4172-b344-de3e72b52900-kube-api-access-dfr8f\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469273 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-system-configs\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469327 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-push\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469379 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-buildworkdir\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469408 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-buildcachedir\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469436 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-node-pullsecrets\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469496 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-build-blob-cache\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469530 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-pull\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469592 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469650 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-run\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469699 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-proxy-ca-bundles\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469730 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-ca-bundles\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469760 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.470004 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.470019 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.470735 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.471116 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.471248 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.472739 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.473166 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.473448 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.473931 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.477960 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c03433b-4b96-4172-b344-de3e72b52900-kube-api-access-dfr8f" (OuterVolumeSpecName: "kube-api-access-dfr8f") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "kube-api-access-dfr8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.477938 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.478226 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571428 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571471 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571486 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571502 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571514 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571527 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571539 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-root\") on node \"crc\" DevicePath 
\"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571573 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfr8f\" (UniqueName: \"kubernetes.io/projected/7c03433b-4b96-4172-b344-de3e72b52900-kube-api-access-dfr8f\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571587 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571599 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:58 crc kubenswrapper[4816]: I0316 00:32:58.140807 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_7c03433b-4b96-4172-b344-de3e72b52900/docker-build/0.log" Mar 16 00:32:58 crc kubenswrapper[4816]: I0316 00:32:58.141132 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"7c03433b-4b96-4172-b344-de3e72b52900","Type":"ContainerDied","Data":"68913983495e9d0f1ddcf3da8b311626b33a5d1edfe1be51675b53ba41003d13"} Mar 16 00:32:58 crc kubenswrapper[4816]: I0316 00:32:58.141165 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68913983495e9d0f1ddcf3da8b311626b33a5d1edfe1be51675b53ba41003d13" Mar 16 00:32:58 crc kubenswrapper[4816]: I0316 00:32:58.141201 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:33:01 crc kubenswrapper[4816]: I0316 00:33:01.863125 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:33:01 crc kubenswrapper[4816]: I0316 00:33:01.863472 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:33:01 crc kubenswrapper[4816]: I0316 00:33:01.863521 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:33:01 crc kubenswrapper[4816]: I0316 00:33:01.864288 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"92fd160da980a35a692640a98195800839c4f80b2447586e89c2230217ad0071"} pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:33:01 crc kubenswrapper[4816]: I0316 00:33:01.864369 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" containerID="cri-o://92fd160da980a35a692640a98195800839c4f80b2447586e89c2230217ad0071" gracePeriod=600 Mar 16 00:33:02 crc kubenswrapper[4816]: I0316 00:33:02.167396 4816 generic.go:334] "Generic (PLEG): container finished" 
podID="dd08ece2-7636-4966-973a-e96a34b70b53" containerID="92fd160da980a35a692640a98195800839c4f80b2447586e89c2230217ad0071" exitCode=0 Mar 16 00:33:02 crc kubenswrapper[4816]: I0316 00:33:02.167474 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerDied","Data":"92fd160da980a35a692640a98195800839c4f80b2447586e89c2230217ad0071"} Mar 16 00:33:02 crc kubenswrapper[4816]: I0316 00:33:02.167737 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a"} Mar 16 00:33:02 crc kubenswrapper[4816]: I0316 00:33:02.167757 4816 scope.go:117] "RemoveContainer" containerID="4dae7771bcc5c45d3db6bc1014246492c003743ca85668bad7e04528051cc6bc" Mar 16 00:33:04 crc kubenswrapper[4816]: I0316 00:33:04.442570 4816 scope.go:117] "RemoveContainer" containerID="0b8b2a24c4f32aff091a974cc84de6242e724aacb4bfa1cc19578627d86a25d5" Mar 16 00:33:04 crc kubenswrapper[4816]: I0316 00:33:04.724595 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 16 00:33:04 crc kubenswrapper[4816]: I0316 00:33:04.732641 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 16 00:33:05 crc kubenswrapper[4816]: I0316 00:33:05.676537 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c03433b-4b96-4172-b344-de3e72b52900" path="/var/lib/kubelet/pods/7c03433b-4b96-4172-b344-de3e72b52900/volumes" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.708831 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 16 00:33:06 crc kubenswrapper[4816]: 
E0316 00:33:06.709137 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c03433b-4b96-4172-b344-de3e72b52900" containerName="docker-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.709152 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c03433b-4b96-4172-b344-de3e72b52900" containerName="docker-build" Mar 16 00:33:06 crc kubenswrapper[4816]: E0316 00:33:06.709169 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c03433b-4b96-4172-b344-de3e72b52900" containerName="manage-dockerfile" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.709177 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c03433b-4b96-4172-b344-de3e72b52900" containerName="manage-dockerfile" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.709332 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c03433b-4b96-4172-b344-de3e72b52900" containerName="docker-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.710447 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.716327 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-sys-config" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.716788 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.716949 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-ca" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.717074 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-global-ca" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.724449 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.907508 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.907745 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.907860 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.907934 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.908023 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.908124 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.908241 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-system-configs\") pod 
\"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.908377 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.908469 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.908576 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.908981 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjlvd\" (UniqueName: \"kubernetes.io/projected/4ac9b18d-8362-488c-a816-c85899c4aa6e-kube-api-access-xjlvd\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.909159 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.010926 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.010978 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.010996 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011013 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: 
\"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011041 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011067 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011095 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011113 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011128 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011142 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011170 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjlvd\" (UniqueName: \"kubernetes.io/projected/4ac9b18d-8362-488c-a816-c85899c4aa6e-kube-api-access-xjlvd\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011214 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011220 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 
00:33:07.011667 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011679 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011928 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011988 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.012103 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " 
pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.012107 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.012193 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.012341 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.017056 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.017326 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: 
\"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.029214 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjlvd\" (UniqueName: \"kubernetes.io/projected/4ac9b18d-8362-488c-a816-c85899c4aa6e-kube-api-access-xjlvd\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.071967 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.274793 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 16 00:33:08 crc kubenswrapper[4816]: I0316 00:33:08.215121 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"4ac9b18d-8362-488c-a816-c85899c4aa6e","Type":"ContainerStarted","Data":"72a8749a1dbd2bdcde0b4e49f8bc79349f100d9e298284d8ed3050dbf4e9a676"} Mar 16 00:33:08 crc kubenswrapper[4816]: I0316 00:33:08.215181 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"4ac9b18d-8362-488c-a816-c85899c4aa6e","Type":"ContainerStarted","Data":"b6c00ab1c06d051f1145bfff23f81747ff821693325bc2d3f63912c28c948e1c"} Mar 16 00:33:09 crc kubenswrapper[4816]: I0316 00:33:09.223907 4816 generic.go:334] "Generic (PLEG): container finished" podID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerID="72a8749a1dbd2bdcde0b4e49f8bc79349f100d9e298284d8ed3050dbf4e9a676" exitCode=0 Mar 16 00:33:09 crc 
kubenswrapper[4816]: I0316 00:33:09.224003 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"4ac9b18d-8362-488c-a816-c85899c4aa6e","Type":"ContainerDied","Data":"72a8749a1dbd2bdcde0b4e49f8bc79349f100d9e298284d8ed3050dbf4e9a676"} Mar 16 00:33:10 crc kubenswrapper[4816]: I0316 00:33:10.232506 4816 generic.go:334] "Generic (PLEG): container finished" podID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerID="6b82c27233585220fc0a8e3b4927009ca2ab8e3e309563a177d52d0653f2ae10" exitCode=0 Mar 16 00:33:10 crc kubenswrapper[4816]: I0316 00:33:10.232596 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"4ac9b18d-8362-488c-a816-c85899c4aa6e","Type":"ContainerDied","Data":"6b82c27233585220fc0a8e3b4927009ca2ab8e3e309563a177d52d0653f2ae10"} Mar 16 00:33:10 crc kubenswrapper[4816]: I0316 00:33:10.288888 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-2-build_4ac9b18d-8362-488c-a816-c85899c4aa6e/manage-dockerfile/0.log" Mar 16 00:33:11 crc kubenswrapper[4816]: I0316 00:33:11.240621 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"4ac9b18d-8362-488c-a816-c85899c4aa6e","Type":"ContainerStarted","Data":"514ea21f1c2bd0b5e2d6897180f4ed6f68308cbb5552dadf3beb9f36f0ab1f92"} Mar 16 00:33:11 crc kubenswrapper[4816]: I0316 00:33:11.267982 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-bundle-2-build" podStartSLOduration=5.267966398 podStartE2EDuration="5.267966398s" podCreationTimestamp="2026-03-16 00:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:33:11.265622613 +0000 UTC m=+1584.361922566" 
watchObservedRunningTime="2026-03-16 00:33:11.267966398 +0000 UTC m=+1584.364266351" Mar 16 00:33:14 crc kubenswrapper[4816]: I0316 00:33:14.257907 4816 generic.go:334] "Generic (PLEG): container finished" podID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerID="514ea21f1c2bd0b5e2d6897180f4ed6f68308cbb5552dadf3beb9f36f0ab1f92" exitCode=0 Mar 16 00:33:14 crc kubenswrapper[4816]: I0316 00:33:14.258016 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"4ac9b18d-8362-488c-a816-c85899c4aa6e","Type":"ContainerDied","Data":"514ea21f1c2bd0b5e2d6897180f4ed6f68308cbb5552dadf3beb9f36f0ab1f92"} Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.557938 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.717867 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-run\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.717966 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-push\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718048 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-ca-bundles\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 
crc kubenswrapper[4816]: I0316 00:33:15.718075 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildworkdir\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718096 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-blob-cache\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718124 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-root\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718155 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjlvd\" (UniqueName: \"kubernetes.io/projected/4ac9b18d-8362-488c-a816-c85899c4aa6e-kube-api-access-xjlvd\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718181 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-proxy-ca-bundles\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718211 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-system-configs\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718245 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-pull\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718270 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-node-pullsecrets\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718289 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildcachedir\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718999 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.719071 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.719395 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.719634 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718681 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.720038 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.719704 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.721590 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.724896 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.725486 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ac9b18d-8362-488c-a816-c85899c4aa6e-kube-api-access-xjlvd" (OuterVolumeSpecName: "kube-api-access-xjlvd") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "kube-api-access-xjlvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.726161 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.729741 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820073 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820123 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820137 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820148 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820159 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820169 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820181 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-ca-bundles\") on node \"crc\" 
DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820190 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820201 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820210 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820222 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjlvd\" (UniqueName: \"kubernetes.io/projected/4ac9b18d-8362-488c-a816-c85899c4aa6e-kube-api-access-xjlvd\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820233 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:16 crc kubenswrapper[4816]: I0316 00:33:16.273573 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"4ac9b18d-8362-488c-a816-c85899c4aa6e","Type":"ContainerDied","Data":"b6c00ab1c06d051f1145bfff23f81747ff821693325bc2d3f63912c28c948e1c"} Mar 16 00:33:16 crc kubenswrapper[4816]: I0316 00:33:16.273611 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6c00ab1c06d051f1145bfff23f81747ff821693325bc2d3f63912c28c948e1c" Mar 16 00:33:16 crc kubenswrapper[4816]: I0316 
00:33:16.273657 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.200291 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 16 00:33:32 crc kubenswrapper[4816]: E0316 00:33:32.200984 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerName="git-clone" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.200998 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerName="git-clone" Mar 16 00:33:32 crc kubenswrapper[4816]: E0316 00:33:32.201008 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerName="docker-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.201014 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerName="docker-build" Mar 16 00:33:32 crc kubenswrapper[4816]: E0316 00:33:32.201024 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerName="manage-dockerfile" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.201030 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerName="manage-dockerfile" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.201145 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerName="docker-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.201919 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.205310 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-index-dockercfg" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.206589 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-ca" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.206804 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-global-ca" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.207030 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.207153 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-sys-config" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.227116 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.325834 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.325992 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildcachedir\") pod 
\"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326031 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326108 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326132 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326153 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" 
Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326211 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326254 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326281 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326314 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326377 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326446 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326504 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rhdb\" (UniqueName: \"kubernetes.io/projected/ff71ddfd-c6da-40e7-ac26-e7178e364679-kube-api-access-9rhdb\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.427837 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.427887 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: 
\"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.427928 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.427961 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.427972 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428008 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428054 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428090 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428126 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428167 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428196 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " 
pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428255 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428306 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rhdb\" (UniqueName: \"kubernetes.io/projected/ff71ddfd-c6da-40e7-ac26-e7178e364679-kube-api-access-9rhdb\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428350 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428698 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428827 4816 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428848 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428917 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428990 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.429114 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 
00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.429615 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.429652 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.433633 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.434225 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.434420 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: 
\"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.446241 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rhdb\" (UniqueName: \"kubernetes.io/projected/ff71ddfd-c6da-40e7-ac26-e7178e364679-kube-api-access-9rhdb\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.531097 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.722539 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 16 00:33:33 crc kubenswrapper[4816]: I0316 00:33:33.396001 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"ff71ddfd-c6da-40e7-ac26-e7178e364679","Type":"ContainerStarted","Data":"4ba83f4162fdd45ac10eb6399a78b3388e8086607eb25657ce3203eaaf7bff54"} Mar 16 00:33:33 crc kubenswrapper[4816]: I0316 00:33:33.396311 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"ff71ddfd-c6da-40e7-ac26-e7178e364679","Type":"ContainerStarted","Data":"2c2104821392a6c5290ae6c58a4c84c015279512137a840ee8488010d57afbe2"} Mar 16 00:33:34 crc kubenswrapper[4816]: I0316 00:33:34.407900 4816 generic.go:334] "Generic (PLEG): container finished" podID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerID="4ba83f4162fdd45ac10eb6399a78b3388e8086607eb25657ce3203eaaf7bff54" 
exitCode=0 Mar 16 00:33:34 crc kubenswrapper[4816]: I0316 00:33:34.408042 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"ff71ddfd-c6da-40e7-ac26-e7178e364679","Type":"ContainerDied","Data":"4ba83f4162fdd45ac10eb6399a78b3388e8086607eb25657ce3203eaaf7bff54"} Mar 16 00:33:35 crc kubenswrapper[4816]: I0316 00:33:35.415463 4816 generic.go:334] "Generic (PLEG): container finished" podID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerID="c3ca84acff081714bca19f133df479a0c9f1aeb453b6556bb680a18c786e81f9" exitCode=0 Mar 16 00:33:35 crc kubenswrapper[4816]: I0316 00:33:35.415520 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"ff71ddfd-c6da-40e7-ac26-e7178e364679","Type":"ContainerDied","Data":"c3ca84acff081714bca19f133df479a0c9f1aeb453b6556bb680a18c786e81f9"} Mar 16 00:33:35 crc kubenswrapper[4816]: I0316 00:33:35.451416 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-1-build_ff71ddfd-c6da-40e7-ac26-e7178e364679/manage-dockerfile/0.log" Mar 16 00:33:36 crc kubenswrapper[4816]: I0316 00:33:36.424786 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"ff71ddfd-c6da-40e7-ac26-e7178e364679","Type":"ContainerStarted","Data":"8b7a0d2de6d39ce58c695e7862c0bf7b6723767d296869ab9afafe7273667037"} Mar 16 00:33:36 crc kubenswrapper[4816]: I0316 00:33:36.452571 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-index-1-build" podStartSLOduration=4.452530745 podStartE2EDuration="4.452530745s" podCreationTimestamp="2026-03-16 00:33:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:33:36.449059229 
+0000 UTC m=+1609.545359192" watchObservedRunningTime="2026-03-16 00:33:36.452530745 +0000 UTC m=+1609.548830718" Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.136999 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560354-nmflm"] Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.138417 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560354-nmflm" Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.140443 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.141884 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.144320 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.149435 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560354-nmflm"] Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.206523 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5ct9\" (UniqueName: \"kubernetes.io/projected/4786aa78-4870-43d7-a324-e3e3dd2c7943-kube-api-access-c5ct9\") pod \"auto-csr-approver-29560354-nmflm\" (UID: \"4786aa78-4870-43d7-a324-e3e3dd2c7943\") " pod="openshift-infra/auto-csr-approver-29560354-nmflm" Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.307532 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5ct9\" (UniqueName: \"kubernetes.io/projected/4786aa78-4870-43d7-a324-e3e3dd2c7943-kube-api-access-c5ct9\") pod \"auto-csr-approver-29560354-nmflm\" (UID: \"4786aa78-4870-43d7-a324-e3e3dd2c7943\") " 
pod="openshift-infra/auto-csr-approver-29560354-nmflm" Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.329900 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5ct9\" (UniqueName: \"kubernetes.io/projected/4786aa78-4870-43d7-a324-e3e3dd2c7943-kube-api-access-c5ct9\") pod \"auto-csr-approver-29560354-nmflm\" (UID: \"4786aa78-4870-43d7-a324-e3e3dd2c7943\") " pod="openshift-infra/auto-csr-approver-29560354-nmflm" Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.484475 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560354-nmflm" Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.900239 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560354-nmflm"] Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.911095 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:34:01 crc kubenswrapper[4816]: I0316 00:34:01.613335 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560354-nmflm" event={"ID":"4786aa78-4870-43d7-a324-e3e3dd2c7943","Type":"ContainerStarted","Data":"86207657188b34b5bd103fcfbc19aacf027614812a3717aef241d2ae2885907a"} Mar 16 00:34:06 crc kubenswrapper[4816]: I0316 00:34:06.648294 4816 generic.go:334] "Generic (PLEG): container finished" podID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerID="8b7a0d2de6d39ce58c695e7862c0bf7b6723767d296869ab9afafe7273667037" exitCode=0 Mar 16 00:34:06 crc kubenswrapper[4816]: I0316 00:34:06.648394 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"ff71ddfd-c6da-40e7-ac26-e7178e364679","Type":"ContainerDied","Data":"8b7a0d2de6d39ce58c695e7862c0bf7b6723767d296869ab9afafe7273667037"} Mar 16 00:34:06 crc kubenswrapper[4816]: I0316 00:34:06.651724 4816 generic.go:334] 
"Generic (PLEG): container finished" podID="4786aa78-4870-43d7-a324-e3e3dd2c7943" containerID="0c26c66eab197680871c2539e7ed1477694cb8e32e0bc0cdad1221a9720899f7" exitCode=0 Mar 16 00:34:06 crc kubenswrapper[4816]: I0316 00:34:06.651789 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560354-nmflm" event={"ID":"4786aa78-4870-43d7-a324-e3e3dd2c7943","Type":"ContainerDied","Data":"0c26c66eab197680871c2539e7ed1477694cb8e32e0bc0cdad1221a9720899f7"} Mar 16 00:34:07 crc kubenswrapper[4816]: I0316 00:34:07.950998 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560354-nmflm" Mar 16 00:34:07 crc kubenswrapper[4816]: I0316 00:34:07.959661 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.114151 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rhdb\" (UniqueName: \"kubernetes.io/projected/ff71ddfd-c6da-40e7-ac26-e7178e364679-kube-api-access-9rhdb\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.114790 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-ca-bundles\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.114849 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildcachedir\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 
crc kubenswrapper[4816]: I0316 00:34:08.114889 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-blob-cache\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.114923 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildworkdir\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.114947 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-system-configs\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.114995 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-run\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.115035 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-push\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.115059 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-proxy-ca-bundles\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.115091 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.115140 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-root\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.115185 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-pull\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.115213 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-node-pullsecrets\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.115240 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5ct9\" (UniqueName: 
\"kubernetes.io/projected/4786aa78-4870-43d7-a324-e3e3dd2c7943-kube-api-access-c5ct9\") pod \"4786aa78-4870-43d7-a324-e3e3dd2c7943\" (UID: \"4786aa78-4870-43d7-a324-e3e3dd2c7943\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.115628 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.115995 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.116119 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.116353 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.116405 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.116711 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.117296 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.119671 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.119800 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.120437 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.120711 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff71ddfd-c6da-40e7-ac26-e7178e364679-kube-api-access-9rhdb" (OuterVolumeSpecName: "kube-api-access-9rhdb") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "kube-api-access-9rhdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.121299 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4786aa78-4870-43d7-a324-e3e3dd2c7943-kube-api-access-c5ct9" (OuterVolumeSpecName: "kube-api-access-c5ct9") pod "4786aa78-4870-43d7-a324-e3e3dd2c7943" (UID: "4786aa78-4870-43d7-a324-e3e3dd2c7943"). InnerVolumeSpecName "kube-api-access-c5ct9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217346 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217391 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217408 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217420 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217432 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217445 4816 reconciler_common.go:293] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217460 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: 
\"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217471 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217483 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5ct9\" (UniqueName: \"kubernetes.io/projected/4786aa78-4870-43d7-a324-e3e3dd2c7943-kube-api-access-c5ct9\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217494 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rhdb\" (UniqueName: \"kubernetes.io/projected/ff71ddfd-c6da-40e7-ac26-e7178e364679-kube-api-access-9rhdb\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217504 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217515 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.657114 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.667234 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560354-nmflm" event={"ID":"4786aa78-4870-43d7-a324-e3e3dd2c7943","Type":"ContainerDied","Data":"86207657188b34b5bd103fcfbc19aacf027614812a3717aef241d2ae2885907a"} Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.667256 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560354-nmflm" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.667267 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86207657188b34b5bd103fcfbc19aacf027614812a3717aef241d2ae2885907a" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.670617 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"ff71ddfd-c6da-40e7-ac26-e7178e364679","Type":"ContainerDied","Data":"2c2104821392a6c5290ae6c58a4c84c015279512137a840ee8488010d57afbe2"} Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.670670 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c2104821392a6c5290ae6c58a4c84c015279512137a840ee8488010d57afbe2" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.670768 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.723434 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:09 crc kubenswrapper[4816]: I0316 00:34:09.004607 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560348-xvv6w"] Mar 16 00:34:09 crc kubenswrapper[4816]: I0316 00:34:09.009099 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560348-xvv6w"] Mar 16 00:34:09 crc kubenswrapper[4816]: I0316 00:34:09.404669 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:34:09 crc kubenswrapper[4816]: I0316 00:34:09.433237 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:09 crc kubenswrapper[4816]: I0316 00:34:09.681216 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a529fd1f-66e5-4e49-b95a-18c6a8aade4b" path="/var/lib/kubelet/pods/a529fd1f-66e5-4e49-b95a-18c6a8aade4b/volumes" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.030929 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-8g6sv"] Mar 16 00:34:10 crc kubenswrapper[4816]: E0316 00:34:10.031254 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4786aa78-4870-43d7-a324-e3e3dd2c7943" containerName="oc" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.031269 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4786aa78-4870-43d7-a324-e3e3dd2c7943" containerName="oc" Mar 16 00:34:10 crc kubenswrapper[4816]: E0316 00:34:10.031284 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerName="manage-dockerfile" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.031292 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerName="manage-dockerfile" Mar 16 00:34:10 crc kubenswrapper[4816]: E0316 00:34:10.031308 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerName="git-clone" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.031315 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerName="git-clone" Mar 16 00:34:10 crc kubenswrapper[4816]: E0316 00:34:10.031327 4816 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerName="docker-build" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.031335 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerName="docker-build" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.031482 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerName="docker-build" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.031495 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="4786aa78-4870-43d7-a324-e3e3dd2c7943" containerName="oc" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.032058 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-8g6sv" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.036775 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-8g6sv"] Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.036889 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"infrawatch-operators-dockercfg-m64wm" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.142216 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6skvx\" (UniqueName: \"kubernetes.io/projected/4abfb758-2aed-48a9-ab16-b7564942a72f-kube-api-access-6skvx\") pod \"infrawatch-operators-8g6sv\" (UID: \"4abfb758-2aed-48a9-ab16-b7564942a72f\") " pod="service-telemetry/infrawatch-operators-8g6sv" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.243668 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6skvx\" (UniqueName: \"kubernetes.io/projected/4abfb758-2aed-48a9-ab16-b7564942a72f-kube-api-access-6skvx\") pod \"infrawatch-operators-8g6sv\" (UID: 
\"4abfb758-2aed-48a9-ab16-b7564942a72f\") " pod="service-telemetry/infrawatch-operators-8g6sv" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.268526 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6skvx\" (UniqueName: \"kubernetes.io/projected/4abfb758-2aed-48a9-ab16-b7564942a72f-kube-api-access-6skvx\") pod \"infrawatch-operators-8g6sv\" (UID: \"4abfb758-2aed-48a9-ab16-b7564942a72f\") " pod="service-telemetry/infrawatch-operators-8g6sv" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.345821 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-8g6sv" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.532856 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-8g6sv"] Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.689273 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-8g6sv" event={"ID":"4abfb758-2aed-48a9-ab16-b7564942a72f","Type":"ContainerStarted","Data":"c0efc16eea6c4c5b32f050536aed0e0e924ed1b0fd0dfdea31edabd998b22a17"} Mar 16 00:34:14 crc kubenswrapper[4816]: I0316 00:34:14.825901 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-8g6sv"] Mar 16 00:34:15 crc kubenswrapper[4816]: I0316 00:34:15.626927 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-fm45p"] Mar 16 00:34:15 crc kubenswrapper[4816]: I0316 00:34:15.629127 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-fm45p" Mar 16 00:34:15 crc kubenswrapper[4816]: I0316 00:34:15.638333 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-fm45p"] Mar 16 00:34:15 crc kubenswrapper[4816]: I0316 00:34:15.818180 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvf2k\" (UniqueName: \"kubernetes.io/projected/ffeac517-cf5e-4a11-898c-8bee3a6e9ee3-kube-api-access-bvf2k\") pod \"infrawatch-operators-fm45p\" (UID: \"ffeac517-cf5e-4a11-898c-8bee3a6e9ee3\") " pod="service-telemetry/infrawatch-operators-fm45p" Mar 16 00:34:15 crc kubenswrapper[4816]: I0316 00:34:15.919728 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvf2k\" (UniqueName: \"kubernetes.io/projected/ffeac517-cf5e-4a11-898c-8bee3a6e9ee3-kube-api-access-bvf2k\") pod \"infrawatch-operators-fm45p\" (UID: \"ffeac517-cf5e-4a11-898c-8bee3a6e9ee3\") " pod="service-telemetry/infrawatch-operators-fm45p" Mar 16 00:34:15 crc kubenswrapper[4816]: I0316 00:34:15.945655 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvf2k\" (UniqueName: \"kubernetes.io/projected/ffeac517-cf5e-4a11-898c-8bee3a6e9ee3-kube-api-access-bvf2k\") pod \"infrawatch-operators-fm45p\" (UID: \"ffeac517-cf5e-4a11-898c-8bee3a6e9ee3\") " pod="service-telemetry/infrawatch-operators-fm45p" Mar 16 00:34:15 crc kubenswrapper[4816]: I0316 00:34:15.975113 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-fm45p" Mar 16 00:34:20 crc kubenswrapper[4816]: I0316 00:34:20.957199 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-fm45p"] Mar 16 00:34:21 crc kubenswrapper[4816]: I0316 00:34:21.777916 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-fm45p" event={"ID":"ffeac517-cf5e-4a11-898c-8bee3a6e9ee3","Type":"ContainerStarted","Data":"6cf2f19e9690d3f84b8debbcca8669acb16b57dfcfbb47cd7268142a254e8a78"} Mar 16 00:34:21 crc kubenswrapper[4816]: I0316 00:34:21.778722 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-fm45p" event={"ID":"ffeac517-cf5e-4a11-898c-8bee3a6e9ee3","Type":"ContainerStarted","Data":"227449b363bbf808a717b95b92e525ea3093476cd0c46bd8fa099fd2019d3718"} Mar 16 00:34:21 crc kubenswrapper[4816]: I0316 00:34:21.780042 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-8g6sv" event={"ID":"4abfb758-2aed-48a9-ab16-b7564942a72f","Type":"ContainerStarted","Data":"3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab"} Mar 16 00:34:21 crc kubenswrapper[4816]: I0316 00:34:21.780158 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-8g6sv" podUID="4abfb758-2aed-48a9-ab16-b7564942a72f" containerName="registry-server" containerID="cri-o://3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab" gracePeriod=2 Mar 16 00:34:21 crc kubenswrapper[4816]: I0316 00:34:21.797660 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-fm45p" podStartSLOduration=6.642028191 podStartE2EDuration="6.797634904s" podCreationTimestamp="2026-03-16 00:34:15 +0000 UTC" firstStartedPulling="2026-03-16 00:34:20.995680519 +0000 UTC m=+1654.091980472" lastFinishedPulling="2026-03-16 
00:34:21.151287232 +0000 UTC m=+1654.247587185" observedRunningTime="2026-03-16 00:34:21.792219544 +0000 UTC m=+1654.888519497" watchObservedRunningTime="2026-03-16 00:34:21.797634904 +0000 UTC m=+1654.893934857" Mar 16 00:34:21 crc kubenswrapper[4816]: I0316 00:34:21.813694 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-8g6sv" podStartSLOduration=1.373134268 podStartE2EDuration="11.813667088s" podCreationTimestamp="2026-03-16 00:34:10 +0000 UTC" firstStartedPulling="2026-03-16 00:34:10.541032878 +0000 UTC m=+1643.637332831" lastFinishedPulling="2026-03-16 00:34:20.981565698 +0000 UTC m=+1654.077865651" observedRunningTime="2026-03-16 00:34:21.81229881 +0000 UTC m=+1654.908598773" watchObservedRunningTime="2026-03-16 00:34:21.813667088 +0000 UTC m=+1654.909967041" Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.200676 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-8g6sv" Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.204981 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6skvx\" (UniqueName: \"kubernetes.io/projected/4abfb758-2aed-48a9-ab16-b7564942a72f-kube-api-access-6skvx\") pod \"4abfb758-2aed-48a9-ab16-b7564942a72f\" (UID: \"4abfb758-2aed-48a9-ab16-b7564942a72f\") " Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.213890 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4abfb758-2aed-48a9-ab16-b7564942a72f-kube-api-access-6skvx" (OuterVolumeSpecName: "kube-api-access-6skvx") pod "4abfb758-2aed-48a9-ab16-b7564942a72f" (UID: "4abfb758-2aed-48a9-ab16-b7564942a72f"). InnerVolumeSpecName "kube-api-access-6skvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.306366 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6skvx\" (UniqueName: \"kubernetes.io/projected/4abfb758-2aed-48a9-ab16-b7564942a72f-kube-api-access-6skvx\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.786763 4816 generic.go:334] "Generic (PLEG): container finished" podID="4abfb758-2aed-48a9-ab16-b7564942a72f" containerID="3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab" exitCode=0 Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.786832 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-8g6sv" Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.786885 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-8g6sv" event={"ID":"4abfb758-2aed-48a9-ab16-b7564942a72f","Type":"ContainerDied","Data":"3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab"} Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.786927 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-8g6sv" event={"ID":"4abfb758-2aed-48a9-ab16-b7564942a72f","Type":"ContainerDied","Data":"c0efc16eea6c4c5b32f050536aed0e0e924ed1b0fd0dfdea31edabd998b22a17"} Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.786949 4816 scope.go:117] "RemoveContainer" containerID="3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab" Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.811236 4816 scope.go:117] "RemoveContainer" containerID="3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab" Mar 16 00:34:22 crc kubenswrapper[4816]: E0316 00:34:22.811689 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab\": container with ID starting with 3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab not found: ID does not exist" containerID="3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab" Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.811730 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab"} err="failed to get container status \"3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab\": rpc error: code = NotFound desc = could not find container \"3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab\": container with ID starting with 3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab not found: ID does not exist" Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.812984 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-8g6sv"] Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.819371 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-8g6sv"] Mar 16 00:34:23 crc kubenswrapper[4816]: I0316 00:34:23.678455 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4abfb758-2aed-48a9-ab16-b7564942a72f" path="/var/lib/kubelet/pods/4abfb758-2aed-48a9-ab16-b7564942a72f/volumes" Mar 16 00:34:25 crc kubenswrapper[4816]: I0316 00:34:25.976115 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-fm45p" Mar 16 00:34:25 crc kubenswrapper[4816]: I0316 00:34:25.976211 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-fm45p" Mar 16 00:34:26 crc kubenswrapper[4816]: I0316 00:34:26.001406 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="service-telemetry/infrawatch-operators-fm45p" Mar 16 00:34:26 crc kubenswrapper[4816]: I0316 00:34:26.848109 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-fm45p" Mar 16 00:34:31 crc kubenswrapper[4816]: I0316 00:34:31.885838 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth"] Mar 16 00:34:31 crc kubenswrapper[4816]: E0316 00:34:31.886730 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abfb758-2aed-48a9-ab16-b7564942a72f" containerName="registry-server" Mar 16 00:34:31 crc kubenswrapper[4816]: I0316 00:34:31.886752 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abfb758-2aed-48a9-ab16-b7564942a72f" containerName="registry-server" Mar 16 00:34:31 crc kubenswrapper[4816]: I0316 00:34:31.886947 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="4abfb758-2aed-48a9-ab16-b7564942a72f" containerName="registry-server" Mar 16 00:34:31 crc kubenswrapper[4816]: I0316 00:34:31.888832 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:31 crc kubenswrapper[4816]: I0316 00:34:31.897125 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth"] Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.023927 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.023990 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkzgc\" (UniqueName: \"kubernetes.io/projected/0b78944c-c894-4d7f-bbe3-82eee916db70-kube-api-access-qkzgc\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.024065 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.124976 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-bundle\") 
pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.125078 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.125101 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkzgc\" (UniqueName: \"kubernetes.io/projected/0b78944c-c894-4d7f-bbe3-82eee916db70-kube-api-access-qkzgc\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.126063 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.126366 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " 
pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.142063 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkzgc\" (UniqueName: \"kubernetes.io/projected/0b78944c-c894-4d7f-bbe3-82eee916db70-kube-api-access-qkzgc\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.208805 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.627518 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth"] Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.854162 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" event={"ID":"0b78944c-c894-4d7f-bbe3-82eee916db70","Type":"ContainerStarted","Data":"6b420a3589052a6db1afcf2db384d95e64c97ce42dcee5c381e8a7882749e59b"} Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.854214 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" event={"ID":"0b78944c-c894-4d7f-bbe3-82eee916db70","Type":"ContainerStarted","Data":"d790c89c5f398f362a0ba954ed74496a86b4d207075a97a2dba1f9ba8fe354b1"} Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.919113 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz"] Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.921168 4816 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.932533 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz"] Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.934008 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.934064 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.934106 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf99h\" (UniqueName: \"kubernetes.io/projected/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-kube-api-access-pf99h\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.035173 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.035275 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.035319 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf99h\" (UniqueName: \"kubernetes.io/projected/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-kube-api-access-pf99h\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.035842 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.036028 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz\" (UID: 
\"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.053842 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf99h\" (UniqueName: \"kubernetes.io/projected/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-kube-api-access-pf99h\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.301122 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.788120 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz"] Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.866681 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" event={"ID":"f6754fbe-ac20-4fe6-8c87-6d30f20069b9","Type":"ContainerStarted","Data":"add5543c2b87b62fcb6cc941f4e9d34042291c66893841625339e4cb436c7477"} Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.867961 4816 generic.go:334] "Generic (PLEG): container finished" podID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerID="6b420a3589052a6db1afcf2db384d95e64c97ce42dcee5c381e8a7882749e59b" exitCode=0 Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.868001 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" 
event={"ID":"0b78944c-c894-4d7f-bbe3-82eee916db70","Type":"ContainerDied","Data":"6b420a3589052a6db1afcf2db384d95e64c97ce42dcee5c381e8a7882749e59b"} Mar 16 00:34:34 crc kubenswrapper[4816]: I0316 00:34:34.876774 4816 generic.go:334] "Generic (PLEG): container finished" podID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerID="18a1d999c028a245479511118fd18d5623e31bf0ba1ed36a8923351ebd56c713" exitCode=0 Mar 16 00:34:34 crc kubenswrapper[4816]: I0316 00:34:34.876867 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" event={"ID":"f6754fbe-ac20-4fe6-8c87-6d30f20069b9","Type":"ContainerDied","Data":"18a1d999c028a245479511118fd18d5623e31bf0ba1ed36a8923351ebd56c713"} Mar 16 00:34:34 crc kubenswrapper[4816]: I0316 00:34:34.879124 4816 generic.go:334] "Generic (PLEG): container finished" podID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerID="5d870941f7b3e569fa84c49a82db1748056ec3fdd6cc2b23128235b07e7d93c9" exitCode=0 Mar 16 00:34:34 crc kubenswrapper[4816]: I0316 00:34:34.879151 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" event={"ID":"0b78944c-c894-4d7f-bbe3-82eee916db70","Type":"ContainerDied","Data":"5d870941f7b3e569fa84c49a82db1748056ec3fdd6cc2b23128235b07e7d93c9"} Mar 16 00:34:35 crc kubenswrapper[4816]: I0316 00:34:35.886392 4816 generic.go:334] "Generic (PLEG): container finished" podID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerID="9c05ec3de572be28234263125f6671d9ce90d8114ad7a91af13eef9b53e34db1" exitCode=0 Mar 16 00:34:35 crc kubenswrapper[4816]: I0316 00:34:35.886669 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" event={"ID":"0b78944c-c894-4d7f-bbe3-82eee916db70","Type":"ContainerDied","Data":"9c05ec3de572be28234263125f6671d9ce90d8114ad7a91af13eef9b53e34db1"} Mar 16 00:34:35 
crc kubenswrapper[4816]: I0316 00:34:35.888093 4816 generic.go:334] "Generic (PLEG): container finished" podID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerID="725cfe790af8de5097674d7a212a554bdd9fec74a150c10cb54be0bcef0edf7a" exitCode=0 Mar 16 00:34:35 crc kubenswrapper[4816]: I0316 00:34:35.888144 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" event={"ID":"f6754fbe-ac20-4fe6-8c87-6d30f20069b9","Type":"ContainerDied","Data":"725cfe790af8de5097674d7a212a554bdd9fec74a150c10cb54be0bcef0edf7a"} Mar 16 00:34:36 crc kubenswrapper[4816]: I0316 00:34:36.923953 4816 generic.go:334] "Generic (PLEG): container finished" podID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerID="f69b2062e753e9aeeaba47277e0394d00ff23e47737c987053cc8946cc8b6c75" exitCode=0 Mar 16 00:34:36 crc kubenswrapper[4816]: I0316 00:34:36.925156 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" event={"ID":"f6754fbe-ac20-4fe6-8c87-6d30f20069b9","Type":"ContainerDied","Data":"f69b2062e753e9aeeaba47277e0394d00ff23e47737c987053cc8946cc8b6c75"} Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.211445 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.392389 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-bundle\") pod \"0b78944c-c894-4d7f-bbe3-82eee916db70\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.392522 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-util\") pod \"0b78944c-c894-4d7f-bbe3-82eee916db70\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.393048 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-bundle" (OuterVolumeSpecName: "bundle") pod "0b78944c-c894-4d7f-bbe3-82eee916db70" (UID: "0b78944c-c894-4d7f-bbe3-82eee916db70"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.397736 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkzgc\" (UniqueName: \"kubernetes.io/projected/0b78944c-c894-4d7f-bbe3-82eee916db70-kube-api-access-qkzgc\") pod \"0b78944c-c894-4d7f-bbe3-82eee916db70\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.398363 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.402106 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78944c-c894-4d7f-bbe3-82eee916db70-kube-api-access-qkzgc" (OuterVolumeSpecName: "kube-api-access-qkzgc") pod "0b78944c-c894-4d7f-bbe3-82eee916db70" (UID: "0b78944c-c894-4d7f-bbe3-82eee916db70"). InnerVolumeSpecName "kube-api-access-qkzgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.413900 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-util" (OuterVolumeSpecName: "util") pod "0b78944c-c894-4d7f-bbe3-82eee916db70" (UID: "0b78944c-c894-4d7f-bbe3-82eee916db70"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.499080 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-util\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.499121 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkzgc\" (UniqueName: \"kubernetes.io/projected/0b78944c-c894-4d7f-bbe3-82eee916db70-kube-api-access-qkzgc\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.934538 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.934770 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" event={"ID":"0b78944c-c894-4d7f-bbe3-82eee916db70","Type":"ContainerDied","Data":"d790c89c5f398f362a0ba954ed74496a86b4d207075a97a2dba1f9ba8fe354b1"} Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.934824 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d790c89c5f398f362a0ba954ed74496a86b4d207075a97a2dba1f9ba8fe354b1" Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.234317 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.306898 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-util\") pod \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.306958 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf99h\" (UniqueName: \"kubernetes.io/projected/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-kube-api-access-pf99h\") pod \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.306994 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-bundle\") pod \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.307646 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-bundle" (OuterVolumeSpecName: "bundle") pod "f6754fbe-ac20-4fe6-8c87-6d30f20069b9" (UID: "f6754fbe-ac20-4fe6-8c87-6d30f20069b9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.316935 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-kube-api-access-pf99h" (OuterVolumeSpecName: "kube-api-access-pf99h") pod "f6754fbe-ac20-4fe6-8c87-6d30f20069b9" (UID: "f6754fbe-ac20-4fe6-8c87-6d30f20069b9"). InnerVolumeSpecName "kube-api-access-pf99h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.321429 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-util" (OuterVolumeSpecName: "util") pod "f6754fbe-ac20-4fe6-8c87-6d30f20069b9" (UID: "f6754fbe-ac20-4fe6-8c87-6d30f20069b9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.408027 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-util\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.408062 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf99h\" (UniqueName: \"kubernetes.io/projected/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-kube-api-access-pf99h\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.408074 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.944530 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" event={"ID":"f6754fbe-ac20-4fe6-8c87-6d30f20069b9","Type":"ContainerDied","Data":"add5543c2b87b62fcb6cc941f4e9d34042291c66893841625339e4cb436c7477"} Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.944901 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="add5543c2b87b62fcb6cc941f4e9d34042291c66893841625339e4cb436c7477" Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.944656 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.747030 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-55c8479bdf-6m44w"] Mar 16 00:34:44 crc kubenswrapper[4816]: E0316 00:34:44.747577 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerName="util" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.747595 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerName="util" Mar 16 00:34:44 crc kubenswrapper[4816]: E0316 00:34:44.747607 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerName="pull" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.747615 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerName="pull" Mar 16 00:34:44 crc kubenswrapper[4816]: E0316 00:34:44.747628 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerName="extract" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.747635 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerName="extract" Mar 16 00:34:44 crc kubenswrapper[4816]: E0316 00:34:44.747651 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerName="util" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.747657 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerName="util" Mar 16 00:34:44 crc kubenswrapper[4816]: E0316 00:34:44.747672 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerName="pull" Mar 16 
00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.747678 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerName="pull" Mar 16 00:34:44 crc kubenswrapper[4816]: E0316 00:34:44.747686 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerName="extract" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.747695 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerName="extract" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.747813 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerName="extract" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.747828 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerName="extract" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.748317 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.750573 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-95tfc" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.761725 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-55c8479bdf-6m44w"] Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.792442 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/e96079bc-73ba-420e-9568-cea10077c4ae-runner\") pod \"smart-gateway-operator-55c8479bdf-6m44w\" (UID: \"e96079bc-73ba-420e-9568-cea10077c4ae\") " pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.792485 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsdw4\" (UniqueName: \"kubernetes.io/projected/e96079bc-73ba-420e-9568-cea10077c4ae-kube-api-access-wsdw4\") pod \"smart-gateway-operator-55c8479bdf-6m44w\" (UID: \"e96079bc-73ba-420e-9568-cea10077c4ae\") " pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.894139 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/e96079bc-73ba-420e-9568-cea10077c4ae-runner\") pod \"smart-gateway-operator-55c8479bdf-6m44w\" (UID: \"e96079bc-73ba-420e-9568-cea10077c4ae\") " pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.894207 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsdw4\" (UniqueName: 
\"kubernetes.io/projected/e96079bc-73ba-420e-9568-cea10077c4ae-kube-api-access-wsdw4\") pod \"smart-gateway-operator-55c8479bdf-6m44w\" (UID: \"e96079bc-73ba-420e-9568-cea10077c4ae\") " pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.894715 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/e96079bc-73ba-420e-9568-cea10077c4ae-runner\") pod \"smart-gateway-operator-55c8479bdf-6m44w\" (UID: \"e96079bc-73ba-420e-9568-cea10077c4ae\") " pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.919880 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsdw4\" (UniqueName: \"kubernetes.io/projected/e96079bc-73ba-420e-9568-cea10077c4ae-kube-api-access-wsdw4\") pod \"smart-gateway-operator-55c8479bdf-6m44w\" (UID: \"e96079bc-73ba-420e-9568-cea10077c4ae\") " pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" Mar 16 00:34:45 crc kubenswrapper[4816]: I0316 00:34:45.071721 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" Mar 16 00:34:45 crc kubenswrapper[4816]: I0316 00:34:45.561187 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-55c8479bdf-6m44w"] Mar 16 00:34:46 crc kubenswrapper[4816]: I0316 00:34:46.002542 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" event={"ID":"e96079bc-73ba-420e-9568-cea10077c4ae","Type":"ContainerStarted","Data":"ae492dbdf77984277c138e17866872cb0ae074398da44666fcd372b9ab93eb20"} Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.293767 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz"] Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.295827 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.301433 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-kx5vr" Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.315323 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz"] Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.354960 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/0faefde0-6740-414f-bf47-0d763a35b22f-runner\") pod \"service-telemetry-operator-7dbcddcc6f-lqrgz\" (UID: \"0faefde0-6740-414f-bf47-0d763a35b22f\") " pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.355017 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b79gs\" 
(UniqueName: \"kubernetes.io/projected/0faefde0-6740-414f-bf47-0d763a35b22f-kube-api-access-b79gs\") pod \"service-telemetry-operator-7dbcddcc6f-lqrgz\" (UID: \"0faefde0-6740-414f-bf47-0d763a35b22f\") " pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.455833 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/0faefde0-6740-414f-bf47-0d763a35b22f-runner\") pod \"service-telemetry-operator-7dbcddcc6f-lqrgz\" (UID: \"0faefde0-6740-414f-bf47-0d763a35b22f\") " pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.455911 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b79gs\" (UniqueName: \"kubernetes.io/projected/0faefde0-6740-414f-bf47-0d763a35b22f-kube-api-access-b79gs\") pod \"service-telemetry-operator-7dbcddcc6f-lqrgz\" (UID: \"0faefde0-6740-414f-bf47-0d763a35b22f\") " pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.456347 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/0faefde0-6740-414f-bf47-0d763a35b22f-runner\") pod \"service-telemetry-operator-7dbcddcc6f-lqrgz\" (UID: \"0faefde0-6740-414f-bf47-0d763a35b22f\") " pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.477117 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b79gs\" (UniqueName: \"kubernetes.io/projected/0faefde0-6740-414f-bf47-0d763a35b22f-kube-api-access-b79gs\") pod \"service-telemetry-operator-7dbcddcc6f-lqrgz\" (UID: \"0faefde0-6740-414f-bf47-0d763a35b22f\") " pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" Mar 16 00:34:48 crc kubenswrapper[4816]: 
I0316 00:34:48.622872 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" Mar 16 00:34:57 crc kubenswrapper[4816]: I0316 00:34:57.843958 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz"] Mar 16 00:34:58 crc kubenswrapper[4816]: I0316 00:34:58.088015 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" event={"ID":"0faefde0-6740-414f-bf47-0d763a35b22f","Type":"ContainerStarted","Data":"7ae4409471dad3fe81d11dad7876cb9dadb1f3b34b935cca3cb522c0fab14ad9"} Mar 16 00:35:01 crc kubenswrapper[4816]: E0316 00:35:01.034611 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/smart-gateway-operator:latest" Mar 16 00:35:01 crc kubenswrapper[4816]: E0316 00:35:01.035077 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/infrawatch/smart-gateway-operator:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:smart-gateway-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:ANSIBLE_VERBOSITY_SMARTGATEWAY_SMARTGATEWAY_INFRA_WATCH,Value:4,ValueFrom:nil,},EnvVar{Name:ANSIBLE_DEBUG_LOGS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CORE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BRIDGE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-bridge:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:smart-gateway-operator.v5.0.1773621141,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wsdw4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop
:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod smart-gateway-operator-55c8479bdf-6m44w_service-telemetry(e96079bc-73ba-420e-9568-cea10077c4ae): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 16 00:35:01 crc kubenswrapper[4816]: E0316 00:35:01.036253 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" podUID="e96079bc-73ba-420e-9568-cea10077c4ae" Mar 16 00:35:01 crc kubenswrapper[4816]: E0316 00:35:01.105902 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/smart-gateway-operator:latest\\\"\"" pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" podUID="e96079bc-73ba-420e-9568-cea10077c4ae" Mar 16 00:35:04 crc kubenswrapper[4816]: I0316 00:35:04.528898 4816 scope.go:117] "RemoveContainer" containerID="2184a6c7d5ea889f0c49670caabfc30e2cdb52bf2b9beb7864557d83b84bbb54" Mar 16 00:35:06 crc kubenswrapper[4816]: I0316 00:35:06.139298 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" event={"ID":"0faefde0-6740-414f-bf47-0d763a35b22f","Type":"ContainerStarted","Data":"5613cc1e49e0a61f32d364bd129d564af21046d3694be5c16aee6856f727d100"} Mar 16 00:35:06 crc kubenswrapper[4816]: I0316 00:35:06.162835 4816 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" podStartSLOduration=10.647741651 podStartE2EDuration="18.162807307s" podCreationTimestamp="2026-03-16 00:34:48 +0000 UTC" firstStartedPulling="2026-03-16 00:34:57.888322692 +0000 UTC m=+1690.984622645" lastFinishedPulling="2026-03-16 00:35:05.403388308 +0000 UTC m=+1698.499688301" observedRunningTime="2026-03-16 00:35:06.16147027 +0000 UTC m=+1699.257770263" watchObservedRunningTime="2026-03-16 00:35:06.162807307 +0000 UTC m=+1699.259107300" Mar 16 00:35:17 crc kubenswrapper[4816]: I0316 00:35:17.228368 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" event={"ID":"e96079bc-73ba-420e-9568-cea10077c4ae","Type":"ContainerStarted","Data":"9ca787bd1ff2bd1d864f8b0766658d09f0733ad33d64614af7275cf5f52a93a0"} Mar 16 00:35:17 crc kubenswrapper[4816]: I0316 00:35:17.247329 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" podStartSLOduration=2.631006352 podStartE2EDuration="33.247305263s" podCreationTimestamp="2026-03-16 00:34:44 +0000 UTC" firstStartedPulling="2026-03-16 00:34:45.56848991 +0000 UTC m=+1678.664789863" lastFinishedPulling="2026-03-16 00:35:16.184788821 +0000 UTC m=+1709.281088774" observedRunningTime="2026-03-16 00:35:17.245241285 +0000 UTC m=+1710.341541238" watchObservedRunningTime="2026-03-16 00:35:17.247305263 +0000 UTC m=+1710.343605226" Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.916878 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-zdlgx"] Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.918087 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.921330 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.921417 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-j42x2" Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.921442 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.922033 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.922036 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.923548 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.923953 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.943394 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-zdlgx"] Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.041094 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: 
\"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.041166 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.041199 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.041216 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-config\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.041249 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-users\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.041391 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.041462 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mth5q\" (UniqueName: \"kubernetes.io/projected/91573536-f8d4-475f-bfb6-dd2ad9910ce0-kube-api-access-mth5q\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.144246 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mth5q\" (UniqueName: \"kubernetes.io/projected/91573536-f8d4-475f-bfb6-dd2ad9910ce0-kube-api-access-mth5q\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.144326 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.144357 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: 
\"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.144381 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.144397 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-config\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.144436 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-users\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.144456 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 
00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.145801 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-config\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.153963 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.154195 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.168142 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.169249 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mth5q\" (UniqueName: 
\"kubernetes.io/projected/91573536-f8d4-475f-bfb6-dd2ad9910ce0-kube-api-access-mth5q\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.170360 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-users\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.184460 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.231895 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.633501 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-zdlgx"] Mar 16 00:35:31 crc kubenswrapper[4816]: W0316 00:35:31.638260 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91573536_f8d4_475f_bfb6_dd2ad9910ce0.slice/crio-26dc3f4ee67e3496e2e4a3275b5368fe7536b3be135627eecf62f5d01c3d1d56 WatchSource:0}: Error finding container 26dc3f4ee67e3496e2e4a3275b5368fe7536b3be135627eecf62f5d01c3d1d56: Status 404 returned error can't find the container with id 26dc3f4ee67e3496e2e4a3275b5368fe7536b3be135627eecf62f5d01c3d1d56 Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.863981 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.864059 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:35:32 crc kubenswrapper[4816]: I0316 00:35:32.341795 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" event={"ID":"91573536-f8d4-475f-bfb6-dd2ad9910ce0","Type":"ContainerStarted","Data":"26dc3f4ee67e3496e2e4a3275b5368fe7536b3be135627eecf62f5d01c3d1d56"} Mar 16 00:35:38 crc kubenswrapper[4816]: I0316 00:35:38.388326 4816 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" event={"ID":"91573536-f8d4-475f-bfb6-dd2ad9910ce0","Type":"ContainerStarted","Data":"b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3"} Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.078085 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" podStartSLOduration=5.421713619 podStartE2EDuration="11.078060946s" podCreationTimestamp="2026-03-16 00:35:30 +0000 UTC" firstStartedPulling="2026-03-16 00:35:31.639796232 +0000 UTC m=+1724.736096185" lastFinishedPulling="2026-03-16 00:35:37.296143519 +0000 UTC m=+1730.392443512" observedRunningTime="2026-03-16 00:35:38.434086869 +0000 UTC m=+1731.530386832" watchObservedRunningTime="2026-03-16 00:35:41.078060946 +0000 UTC m=+1734.174360899" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.083049 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.084457 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.087622 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.087628 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.087629 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.087623 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.087794 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-2" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.087661 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.087759 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.088148 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"serving-certs-ca-bundle" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.088261 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-1" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.088370 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-hcjdg" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.094619 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["service-telemetry/prometheus-default-0"] Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.184688 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-web-config\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.184720 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.184780 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/078376fd-a0f8-4157-8a07-23ce85695dc6-config-out\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.184826 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9ldj\" (UniqueName: \"kubernetes.io/projected/078376fd-a0f8-4157-8a07-23ce85695dc6-kube-api-access-r9ldj\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.184844 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-prometheus-default-rulefiles-2\") pod 
\"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.184858 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-config\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.184907 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.184927 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.184991 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b5feeac4-4007-401f-b65b-dde9f6400e27\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b5feeac4-4007-401f-b65b-dde9f6400e27\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.185040 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/078376fd-a0f8-4157-8a07-23ce85695dc6-tls-assets\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.185057 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.185233 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286257 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286316 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-web-config\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286333 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286354 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/078376fd-a0f8-4157-8a07-23ce85695dc6-config-out\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286382 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9ldj\" (UniqueName: \"kubernetes.io/projected/078376fd-a0f8-4157-8a07-23ce85695dc6-kube-api-access-r9ldj\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286403 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286420 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-config\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286443 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" 
(UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286464 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286493 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b5feeac4-4007-401f-b65b-dde9f6400e27\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b5feeac4-4007-401f-b65b-dde9f6400e27\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286520 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/078376fd-a0f8-4157-8a07-23ce85695dc6-tls-assets\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286538 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: E0316 00:35:41.286744 4816 secret.go:188] Couldn't get secret 
service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Mar 16 00:35:41 crc kubenswrapper[4816]: E0316 00:35:41.286815 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-prometheus-proxy-tls podName:078376fd-a0f8-4157-8a07-23ce85695dc6 nodeName:}" failed. No retries permitted until 2026-03-16 00:35:41.786794936 +0000 UTC m=+1734.883094889 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "078376fd-a0f8-4157-8a07-23ce85695dc6") : secret "default-prometheus-proxy-tls" not found Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.287285 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.287454 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.287515 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " 
pod="service-telemetry/prometheus-default-0"
Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.288127 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.289819 4816 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.289844 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b5feeac4-4007-401f-b65b-dde9f6400e27\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b5feeac4-4007-401f-b65b-dde9f6400e27\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9f03051eddd5b0320ee2397f728e71018fe21f91cbc5c5dd1c3d97248c518ba7/globalmount\"" pod="service-telemetry/prometheus-default-0"
Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.292143 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/078376fd-a0f8-4157-8a07-23ce85695dc6-tls-assets\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.292160 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.293657 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/078376fd-a0f8-4157-8a07-23ce85695dc6-config-out\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.297987 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-config\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.305296 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9ldj\" (UniqueName: \"kubernetes.io/projected/078376fd-a0f8-4157-8a07-23ce85695dc6-kube-api-access-r9ldj\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.310976 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b5feeac4-4007-401f-b65b-dde9f6400e27\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b5feeac4-4007-401f-b65b-dde9f6400e27\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.312091 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-web-config\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.792535 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:35:41 crc kubenswrapper[4816]: E0316 00:35:41.792763 4816 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found
Mar 16 00:35:41 crc kubenswrapper[4816]: E0316 00:35:41.792936 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-prometheus-proxy-tls podName:078376fd-a0f8-4157-8a07-23ce85695dc6 nodeName:}" failed. No retries permitted until 2026-03-16 00:35:42.792911607 +0000 UTC m=+1735.889211590 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "078376fd-a0f8-4157-8a07-23ce85695dc6") : secret "default-prometheus-proxy-tls" not found
Mar 16 00:35:42 crc kubenswrapper[4816]: I0316 00:35:42.804804 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:35:42 crc kubenswrapper[4816]: I0316 00:35:42.814454 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:35:42 crc kubenswrapper[4816]: I0316 00:35:42.901050 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0"
Mar 16 00:35:43 crc kubenswrapper[4816]: I0316 00:35:43.184809 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"]
Mar 16 00:35:43 crc kubenswrapper[4816]: I0316 00:35:43.423898 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"078376fd-a0f8-4157-8a07-23ce85695dc6","Type":"ContainerStarted","Data":"6bf29a457db28cca4c02bce3433651e24d35d93b2613dab3606cc5069dc54224"}
Mar 16 00:35:48 crc kubenswrapper[4816]: I0316 00:35:48.480163 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"078376fd-a0f8-4157-8a07-23ce85695dc6","Type":"ContainerStarted","Data":"ad5d6ee1ddaedfce29f1a42c7783da7e5b45d684fe8d750e463ea53229b88e8e"}
Mar 16 00:35:51 crc kubenswrapper[4816]: I0316 00:35:51.086623 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-2n6hc"]
Mar 16 00:35:51 crc kubenswrapper[4816]: I0316 00:35:51.087799 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-2n6hc"
Mar 16 00:35:51 crc kubenswrapper[4816]: I0316 00:35:51.119763 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlhrl\" (UniqueName: \"kubernetes.io/projected/5be5ca83-5116-48dc-8d6c-733cbd3e9682-kube-api-access-rlhrl\") pod \"default-snmp-webhook-6856cfb745-2n6hc\" (UID: \"5be5ca83-5116-48dc-8d6c-733cbd3e9682\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-2n6hc"
Mar 16 00:35:51 crc kubenswrapper[4816]: I0316 00:35:51.138825 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-2n6hc"]
Mar 16 00:35:51 crc kubenswrapper[4816]: I0316 00:35:51.220458 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlhrl\" (UniqueName: \"kubernetes.io/projected/5be5ca83-5116-48dc-8d6c-733cbd3e9682-kube-api-access-rlhrl\") pod \"default-snmp-webhook-6856cfb745-2n6hc\" (UID: \"5be5ca83-5116-48dc-8d6c-733cbd3e9682\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-2n6hc"
Mar 16 00:35:51 crc kubenswrapper[4816]: I0316 00:35:51.239775 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlhrl\" (UniqueName: \"kubernetes.io/projected/5be5ca83-5116-48dc-8d6c-733cbd3e9682-kube-api-access-rlhrl\") pod \"default-snmp-webhook-6856cfb745-2n6hc\" (UID: \"5be5ca83-5116-48dc-8d6c-733cbd3e9682\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-2n6hc"
Mar 16 00:35:51 crc kubenswrapper[4816]: I0316 00:35:51.404522 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-2n6hc"
Mar 16 00:35:51 crc kubenswrapper[4816]: I0316 00:35:51.856675 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-2n6hc"]
Mar 16 00:35:52 crc kubenswrapper[4816]: I0316 00:35:52.508922 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-2n6hc" event={"ID":"5be5ca83-5116-48dc-8d6c-733cbd3e9682","Type":"ContainerStarted","Data":"9a41732642c14a0983b9e80b2ccd79de5ab039325162e3f4d598cc4a04912d7f"}
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.547582 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"]
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.550836 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.553354 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.553573 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.554017 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.554148 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.554279 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.561189 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-tg86t"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.571848 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"]
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.674244 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-82d58573-88f1-4f19-b2e3-271c3d59b363\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d58573-88f1-4f19-b2e3-271c3d59b363\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.674283 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-config-out\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.674343 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-web-config\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.674363 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-config-volume\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.674380 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.674432 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j75xg\" (UniqueName: \"kubernetes.io/projected/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-kube-api-access-j75xg\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.674491 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-tls-assets\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.674513 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.674567 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.775502 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-tls-assets\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.775560 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.775583 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.775648 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-82d58573-88f1-4f19-b2e3-271c3d59b363\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d58573-88f1-4f19-b2e3-271c3d59b363\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.775663 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-config-out\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.775706 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-web-config\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.775733 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-config-volume\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.775766 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.775787 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j75xg\" (UniqueName: \"kubernetes.io/projected/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-kube-api-access-j75xg\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: E0316 00:35:54.776458 4816 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found
Mar 16 00:35:54 crc kubenswrapper[4816]: E0316 00:35:54.776506 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls podName:f4698c34-c93e-4d6f-8ab8-2bfcf3118410 nodeName:}" failed. No retries permitted until 2026-03-16 00:35:55.276491789 +0000 UTC m=+1748.372791742 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "f4698c34-c93e-4d6f-8ab8-2bfcf3118410") : secret "default-alertmanager-proxy-tls" not found
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.781618 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-web-config\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.782192 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-tls-assets\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.784100 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-config-volume\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.784534 4816 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.784578 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-82d58573-88f1-4f19-b2e3-271c3d59b363\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d58573-88f1-4f19-b2e3-271c3d59b363\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4ee44c0f5a301a68cae099f4f571c2cc1fd272d329d5eaa334a496c29057408e/globalmount\"" pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.785763 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.786133 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.792761 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-config-out\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.795033 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j75xg\" (UniqueName: \"kubernetes.io/projected/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-kube-api-access-j75xg\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.824703 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-82d58573-88f1-4f19-b2e3-271c3d59b363\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d58573-88f1-4f19-b2e3-271c3d59b363\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:55 crc kubenswrapper[4816]: I0316 00:35:55.283250 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:55 crc kubenswrapper[4816]: E0316 00:35:55.283446 4816 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found
Mar 16 00:35:55 crc kubenswrapper[4816]: E0316 00:35:55.283530 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls podName:f4698c34-c93e-4d6f-8ab8-2bfcf3118410 nodeName:}" failed. No retries permitted until 2026-03-16 00:35:56.283512506 +0000 UTC m=+1749.379812449 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "f4698c34-c93e-4d6f-8ab8-2bfcf3118410") : secret "default-alertmanager-proxy-tls" not found
Mar 16 00:35:55 crc kubenswrapper[4816]: I0316 00:35:55.534355 4816 generic.go:334] "Generic (PLEG): container finished" podID="078376fd-a0f8-4157-8a07-23ce85695dc6" containerID="ad5d6ee1ddaedfce29f1a42c7783da7e5b45d684fe8d750e463ea53229b88e8e" exitCode=0
Mar 16 00:35:55 crc kubenswrapper[4816]: I0316 00:35:55.534396 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"078376fd-a0f8-4157-8a07-23ce85695dc6","Type":"ContainerDied","Data":"ad5d6ee1ddaedfce29f1a42c7783da7e5b45d684fe8d750e463ea53229b88e8e"}
Mar 16 00:35:56 crc kubenswrapper[4816]: I0316 00:35:56.296759 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:56 crc kubenswrapper[4816]: E0316 00:35:56.296927 4816 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found
Mar 16 00:35:56 crc kubenswrapper[4816]: E0316 00:35:56.297113 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls podName:f4698c34-c93e-4d6f-8ab8-2bfcf3118410 nodeName:}" failed. No retries permitted until 2026-03-16 00:35:58.297091637 +0000 UTC m=+1751.393391590 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "f4698c34-c93e-4d6f-8ab8-2bfcf3118410") : secret "default-alertmanager-proxy-tls" not found
Mar 16 00:35:58 crc kubenswrapper[4816]: I0316 00:35:58.325627 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:58 crc kubenswrapper[4816]: I0316 00:35:58.335441 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:58 crc kubenswrapper[4816]: I0316 00:35:58.477872 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:59 crc kubenswrapper[4816]: I0316 00:35:59.271731 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"]
Mar 16 00:35:59 crc kubenswrapper[4816]: W0316 00:35:59.275072 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4698c34_c93e_4d6f_8ab8_2bfcf3118410.slice/crio-79cc4306d089725a98fc0e8dd5dbf2e63a5f38f1a2fb06a495d7a2bcb49ed0d7 WatchSource:0}: Error finding container 79cc4306d089725a98fc0e8dd5dbf2e63a5f38f1a2fb06a495d7a2bcb49ed0d7: Status 404 returned error can't find the container with id 79cc4306d089725a98fc0e8dd5dbf2e63a5f38f1a2fb06a495d7a2bcb49ed0d7
Mar 16 00:35:59 crc kubenswrapper[4816]: I0316 00:35:59.576353 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-2n6hc" event={"ID":"5be5ca83-5116-48dc-8d6c-733cbd3e9682","Type":"ContainerStarted","Data":"1d2f4d2e87dd59ce19fa126c3b86c4e809d1d1b48ecda31a05892b90fe315aa5"}
Mar 16 00:35:59 crc kubenswrapper[4816]: I0316 00:35:59.577254 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f4698c34-c93e-4d6f-8ab8-2bfcf3118410","Type":"ContainerStarted","Data":"79cc4306d089725a98fc0e8dd5dbf2e63a5f38f1a2fb06a495d7a2bcb49ed0d7"}
Mar 16 00:35:59 crc kubenswrapper[4816]: I0316 00:35:59.594128 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-6856cfb745-2n6hc" podStartSLOduration=1.244467202 podStartE2EDuration="8.59409859s" podCreationTimestamp="2026-03-16 00:35:51 +0000 UTC" firstStartedPulling="2026-03-16 00:35:51.86165709 +0000 UTC m=+1744.957957053" lastFinishedPulling="2026-03-16 00:35:59.211288488 +0000 UTC m=+1752.307588441" observedRunningTime="2026-03-16 00:35:59.587874484 +0000 UTC m=+1752.684174437" watchObservedRunningTime="2026-03-16 00:35:59.59409859 +0000 UTC m=+1752.690398543"
Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.128490 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560356-vl86r"]
Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.129475 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560356-vl86r"
Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.132454 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.132525 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r"
Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.132454 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.141038 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560356-vl86r"]
Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.191418 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-492kg\" (UniqueName: \"kubernetes.io/projected/0ffa2c59-99b2-4d5c-9c19-f0921ce688cb-kube-api-access-492kg\") pod \"auto-csr-approver-29560356-vl86r\" (UID: \"0ffa2c59-99b2-4d5c-9c19-f0921ce688cb\") " pod="openshift-infra/auto-csr-approver-29560356-vl86r"
Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.293022 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-492kg\" (UniqueName: \"kubernetes.io/projected/0ffa2c59-99b2-4d5c-9c19-f0921ce688cb-kube-api-access-492kg\") pod \"auto-csr-approver-29560356-vl86r\" (UID: \"0ffa2c59-99b2-4d5c-9c19-f0921ce688cb\") " pod="openshift-infra/auto-csr-approver-29560356-vl86r"
Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.315195 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-492kg\" (UniqueName: \"kubernetes.io/projected/0ffa2c59-99b2-4d5c-9c19-f0921ce688cb-kube-api-access-492kg\") pod \"auto-csr-approver-29560356-vl86r\" (UID: \"0ffa2c59-99b2-4d5c-9c19-f0921ce688cb\") " pod="openshift-infra/auto-csr-approver-29560356-vl86r"
Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.508931 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560356-vl86r"
Mar 16 00:36:01 crc kubenswrapper[4816]: I0316 00:36:01.592235 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f4698c34-c93e-4d6f-8ab8-2bfcf3118410","Type":"ContainerStarted","Data":"ee1e85a329cf517225a11b56c35c0a2882b2081d4fc2c84d9ff8eeaf4e90e1da"}
Mar 16 00:36:01 crc kubenswrapper[4816]: I0316 00:36:01.863089 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 16 00:36:01 crc kubenswrapper[4816]: I0316 00:36:01.863150 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 16 00:36:03 crc kubenswrapper[4816]: I0316 00:36:03.175062 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560356-vl86r"]
Mar 16 00:36:03 crc kubenswrapper[4816]: I0316 00:36:03.608175 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"078376fd-a0f8-4157-8a07-23ce85695dc6","Type":"ContainerStarted","Data":"732049fbfa9d0a365a241a1d58f29d95ffbd1ab9e8ed904c2672dbf9e157a3e0"}
Mar 16 00:36:03 crc kubenswrapper[4816]: I0316 00:36:03.609763 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560356-vl86r" event={"ID":"0ffa2c59-99b2-4d5c-9c19-f0921ce688cb","Type":"ContainerStarted","Data":"7438600c9a8d11afa86633eb3a1d44cf1dfa3d95de38221bd7c8d0e7539f9b23"}
Mar 16 00:36:04 crc kubenswrapper[4816]: I0316 00:36:04.619328 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560356-vl86r" event={"ID":"0ffa2c59-99b2-4d5c-9c19-f0921ce688cb","Type":"ContainerStarted","Data":"a31a7a7d2a1fafc20c9ab619317cd87262faab560369319826f6c184261023c4"}
Mar 16 00:36:04 crc kubenswrapper[4816]: I0316 00:36:04.645733 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29560356-vl86r" podStartSLOduration=3.676952788 podStartE2EDuration="4.645709474s" podCreationTimestamp="2026-03-16 00:36:00 +0000 UTC" firstStartedPulling="2026-03-16 00:36:03.180700865 +0000 UTC m=+1756.277000818" lastFinishedPulling="2026-03-16 00:36:04.149457531 +0000 UTC m=+1757.245757504" observedRunningTime="2026-03-16 00:36:04.634623551 +0000 UTC m=+1757.730923514" watchObservedRunningTime="2026-03-16 00:36:04.645709474 +0000 UTC m=+1757.742009427"
Mar 16 00:36:05 crc kubenswrapper[4816]: I0316 00:36:05.628184 4816 generic.go:334] "Generic (PLEG): container finished" podID="0ffa2c59-99b2-4d5c-9c19-f0921ce688cb" containerID="a31a7a7d2a1fafc20c9ab619317cd87262faab560369319826f6c184261023c4" exitCode=0
Mar 16 00:36:05 crc kubenswrapper[4816]: I0316 00:36:05.628289 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560356-vl86r" event={"ID":"0ffa2c59-99b2-4d5c-9c19-f0921ce688cb","Type":"ContainerDied","Data":"a31a7a7d2a1fafc20c9ab619317cd87262faab560369319826f6c184261023c4"}
Mar 16 00:36:05 crc kubenswrapper[4816]: I0316 00:36:05.634177 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"078376fd-a0f8-4157-8a07-23ce85695dc6","Type":"ContainerStarted","Data":"f829b387c5d2c80c4804c84b02a83f69b186ab0515fa16f884633c0155dce723"}
Mar 16 00:36:06 crc kubenswrapper[4816]: I0316 00:36:06.905848 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560356-vl86r"
Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.088097 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-492kg\" (UniqueName: \"kubernetes.io/projected/0ffa2c59-99b2-4d5c-9c19-f0921ce688cb-kube-api-access-492kg\") pod \"0ffa2c59-99b2-4d5c-9c19-f0921ce688cb\" (UID: \"0ffa2c59-99b2-4d5c-9c19-f0921ce688cb\") "
Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.107748 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ffa2c59-99b2-4d5c-9c19-f0921ce688cb-kube-api-access-492kg" (OuterVolumeSpecName: "kube-api-access-492kg") pod "0ffa2c59-99b2-4d5c-9c19-f0921ce688cb" (UID: "0ffa2c59-99b2-4d5c-9c19-f0921ce688cb"). InnerVolumeSpecName "kube-api-access-492kg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.189699 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-492kg\" (UniqueName: \"kubernetes.io/projected/0ffa2c59-99b2-4d5c-9c19-f0921ce688cb-kube-api-access-492kg\") on node \"crc\" DevicePath \"\""
Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.650400 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560356-vl86r" event={"ID":"0ffa2c59-99b2-4d5c-9c19-f0921ce688cb","Type":"ContainerDied","Data":"7438600c9a8d11afa86633eb3a1d44cf1dfa3d95de38221bd7c8d0e7539f9b23"}
Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.650491 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7438600c9a8d11afa86633eb3a1d44cf1dfa3d95de38221bd7c8d0e7539f9b23"
Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.652944 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560356-vl86r"
Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.652974 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4698c34-c93e-4d6f-8ab8-2bfcf3118410" containerID="ee1e85a329cf517225a11b56c35c0a2882b2081d4fc2c84d9ff8eeaf4e90e1da" exitCode=0
Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.652997 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f4698c34-c93e-4d6f-8ab8-2bfcf3118410","Type":"ContainerDied","Data":"ee1e85a329cf517225a11b56c35c0a2882b2081d4fc2c84d9ff8eeaf4e90e1da"}
Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.701804 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560350-6dpp5"]
Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.706708 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560350-6dpp5"]
Mar 
16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.791978 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t"] Mar 16 00:36:07 crc kubenswrapper[4816]: E0316 00:36:07.792307 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ffa2c59-99b2-4d5c-9c19-f0921ce688cb" containerName="oc" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.792321 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ffa2c59-99b2-4d5c-9c19-f0921ce688cb" containerName="oc" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.792483 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ffa2c59-99b2-4d5c-9c19-f0921ce688cb" containerName="oc" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.793368 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.795480 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-kjf9t" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.795802 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.795906 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.797252 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.804051 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t"] Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.900562 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.900616 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/4de6f751-2471-4ce9-a771-00703e7be02a-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.900690 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/4de6f751-2471-4ce9-a771-00703e7be02a-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.900710 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w65dq\" (UniqueName: \"kubernetes.io/projected/4de6f751-2471-4ce9-a771-00703e7be02a-kube-api-access-w65dq\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.900775 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.001520 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/4de6f751-2471-4ce9-a771-00703e7be02a-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.001579 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w65dq\" (UniqueName: \"kubernetes.io/projected/4de6f751-2471-4ce9-a771-00703e7be02a-kube-api-access-w65dq\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.001627 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.001661 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: 
\"kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.001684 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/4de6f751-2471-4ce9-a771-00703e7be02a-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: E0316 00:36:08.002055 4816 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 16 00:36:08 crc kubenswrapper[4816]: E0316 00:36:08.002108 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-default-cloud1-coll-meter-proxy-tls podName:4de6f751-2471-4ce9-a771-00703e7be02a nodeName:}" failed. No retries permitted until 2026-03-16 00:36:08.502093062 +0000 UTC m=+1761.598393015 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" (UID: "4de6f751-2471-4ce9-a771-00703e7be02a") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.002180 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/4de6f751-2471-4ce9-a771-00703e7be02a-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.002838 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/4de6f751-2471-4ce9-a771-00703e7be02a-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.006593 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.017110 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w65dq\" (UniqueName: \"kubernetes.io/projected/4de6f751-2471-4ce9-a771-00703e7be02a-kube-api-access-w65dq\") pod 
\"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.508471 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: E0316 00:36:08.508689 4816 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 16 00:36:08 crc kubenswrapper[4816]: E0316 00:36:08.508955 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-default-cloud1-coll-meter-proxy-tls podName:4de6f751-2471-4ce9-a771-00703e7be02a nodeName:}" failed. No retries permitted until 2026-03-16 00:36:09.508935854 +0000 UTC m=+1762.605235807 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" (UID: "4de6f751-2471-4ce9-a771-00703e7be02a") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 16 00:36:09 crc kubenswrapper[4816]: I0316 00:36:09.521387 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:09 crc kubenswrapper[4816]: I0316 00:36:09.528310 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:09 crc kubenswrapper[4816]: I0316 00:36:09.613535 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:09 crc kubenswrapper[4816]: I0316 00:36:09.678722 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12bfc435-89c2-4917-9bb6-cc2e9eca440c" path="/var/lib/kubelet/pods/12bfc435-89c2-4917-9bb6-cc2e9eca440c/volumes" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.070958 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v"] Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.079498 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.088037 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.088657 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.111806 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v"] Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.234091 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/a7c7b38e-dd7e-469c-ab38-173944ca2943-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.234350 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xk9q7\" (UniqueName: \"kubernetes.io/projected/a7c7b38e-dd7e-469c-ab38-173944ca2943-kube-api-access-xk9q7\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.234380 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/a7c7b38e-dd7e-469c-ab38-173944ca2943-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.234445 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.234482 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.335743 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/a7c7b38e-dd7e-469c-ab38-173944ca2943-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.335785 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk9q7\" (UniqueName: \"kubernetes.io/projected/a7c7b38e-dd7e-469c-ab38-173944ca2943-kube-api-access-xk9q7\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.335817 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/a7c7b38e-dd7e-469c-ab38-173944ca2943-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.335884 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.335925 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" 
(UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: E0316 00:36:10.336226 4816 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 16 00:36:10 crc kubenswrapper[4816]: E0316 00:36:10.336303 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-default-cloud1-ceil-meter-proxy-tls podName:a7c7b38e-dd7e-469c-ab38-173944ca2943 nodeName:}" failed. No retries permitted until 2026-03-16 00:36:10.836284717 +0000 UTC m=+1763.932584660 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" (UID: "a7c7b38e-dd7e-469c-ab38-173944ca2943") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.336886 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/a7c7b38e-dd7e-469c-ab38-173944ca2943-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.337223 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/a7c7b38e-dd7e-469c-ab38-173944ca2943-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc 
kubenswrapper[4816]: I0316 00:36:10.348107 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.353322 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk9q7\" (UniqueName: \"kubernetes.io/projected/a7c7b38e-dd7e-469c-ab38-173944ca2943-kube-api-access-xk9q7\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.851360 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: E0316 00:36:10.851575 4816 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 16 00:36:10 crc kubenswrapper[4816]: E0316 00:36:10.851785 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-default-cloud1-ceil-meter-proxy-tls podName:a7c7b38e-dd7e-469c-ab38-173944ca2943 nodeName:}" failed. No retries permitted until 2026-03-16 00:36:11.851767802 +0000 UTC m=+1764.948067755 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" (UID: "a7c7b38e-dd7e-469c-ab38-173944ca2943") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 16 00:36:11 crc kubenswrapper[4816]: I0316 00:36:11.864333 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:11 crc kubenswrapper[4816]: I0316 00:36:11.871815 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:11 crc kubenswrapper[4816]: I0316 00:36:11.912473 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:12 crc kubenswrapper[4816]: I0316 00:36:12.414326 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t"] Mar 16 00:36:12 crc kubenswrapper[4816]: I0316 00:36:12.474569 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v"] Mar 16 00:36:13 crc kubenswrapper[4816]: W0316 00:36:13.003883 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4de6f751_2471_4ce9_a771_00703e7be02a.slice/crio-5e33734c707e8f6159a1f1a62d31885cbef60b497c91cbefd8fbc54cc6cef8f3 WatchSource:0}: Error finding container 5e33734c707e8f6159a1f1a62d31885cbef60b497c91cbefd8fbc54cc6cef8f3: Status 404 returned error can't find the container with id 5e33734c707e8f6159a1f1a62d31885cbef60b497c91cbefd8fbc54cc6cef8f3 Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.606293 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826"] Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.607744 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.613186 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.617111 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.623176 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826"] Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.691459 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" event={"ID":"4de6f751-2471-4ce9-a771-00703e7be02a","Type":"ContainerStarted","Data":"5e33734c707e8f6159a1f1a62d31885cbef60b497c91cbefd8fbc54cc6cef8f3"} Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.693396 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f4698c34-c93e-4d6f-8ab8-2bfcf3118410","Type":"ContainerStarted","Data":"59a7711ad7908a9dcaa1092ab168fd24e8dc87ebad32c5c5d835eb703fbf9df0"} Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.694984 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" event={"ID":"a7c7b38e-dd7e-469c-ab38-173944ca2943","Type":"ContainerStarted","Data":"c6997e782546bb7636f289fcb9df20186bc0a72c1f6bdabbe999d35fdb7ee8b1"} Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.697344 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" 
event={"ID":"078376fd-a0f8-4157-8a07-23ce85695dc6","Type":"ContainerStarted","Data":"063d12ce8565ba6c45553a0e36f210c5d4eca03103480ebab524025fbbba362d"} Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.727513 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=3.855844548 podStartE2EDuration="33.727495658s" podCreationTimestamp="2026-03-16 00:35:40 +0000 UTC" firstStartedPulling="2026-03-16 00:35:43.191757439 +0000 UTC m=+1736.288057392" lastFinishedPulling="2026-03-16 00:36:13.063408549 +0000 UTC m=+1766.159708502" observedRunningTime="2026-03-16 00:36:13.724326108 +0000 UTC m=+1766.820626061" watchObservedRunningTime="2026-03-16 00:36:13.727495658 +0000 UTC m=+1766.823795611" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.790908 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee673348-980c-44f5-8e33-71a859ce740c-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.790976 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frmkq\" (UniqueName: \"kubernetes.io/projected/ee673348-980c-44f5-8e33-71a859ce740c-kube-api-access-frmkq\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.791046 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.791171 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/ee673348-980c-44f5-8e33-71a859ce740c-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.791240 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.892538 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/ee673348-980c-44f5-8e33-71a859ce740c-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.892624 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: 
\"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.892655 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee673348-980c-44f5-8e33-71a859ce740c-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.892677 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frmkq\" (UniqueName: \"kubernetes.io/projected/ee673348-980c-44f5-8e33-71a859ce740c-kube-api-access-frmkq\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.892718 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: E0316 00:36:13.892835 4816 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 16 00:36:13 crc kubenswrapper[4816]: E0316 00:36:13.892895 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-default-cloud1-sens-meter-proxy-tls podName:ee673348-980c-44f5-8e33-71a859ce740c 
nodeName:}" failed. No retries permitted until 2026-03-16 00:36:14.392880345 +0000 UTC m=+1767.489180298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" (UID: "ee673348-980c-44f5-8e33-71a859ce740c") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.893524 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee673348-980c-44f5-8e33-71a859ce740c-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.896206 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/ee673348-980c-44f5-8e33-71a859ce740c-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.901141 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.913117 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frmkq\" (UniqueName: 
\"kubernetes.io/projected/ee673348-980c-44f5-8e33-71a859ce740c-kube-api-access-frmkq\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:14 crc kubenswrapper[4816]: I0316 00:36:14.400265 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:14 crc kubenswrapper[4816]: E0316 00:36:14.400486 4816 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 16 00:36:14 crc kubenswrapper[4816]: E0316 00:36:14.400623 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-default-cloud1-sens-meter-proxy-tls podName:ee673348-980c-44f5-8e33-71a859ce740c nodeName:}" failed. No retries permitted until 2026-03-16 00:36:15.400598101 +0000 UTC m=+1768.496898054 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" (UID: "ee673348-980c-44f5-8e33-71a859ce740c") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 16 00:36:14 crc kubenswrapper[4816]: I0316 00:36:14.709129 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" event={"ID":"a7c7b38e-dd7e-469c-ab38-173944ca2943","Type":"ContainerStarted","Data":"5e6adc7c1ceb4dee9d75da1b60f5314b5d489eb5abca26b0e1e9530c29a279af"} Mar 16 00:36:14 crc kubenswrapper[4816]: I0316 00:36:14.714636 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" event={"ID":"4de6f751-2471-4ce9-a771-00703e7be02a","Type":"ContainerStarted","Data":"48148e28d8fe141bbe5a156cd9d78d73b91d4be809f4c72cc046d7ef3f240d1f"} Mar 16 00:36:15 crc kubenswrapper[4816]: I0316 00:36:15.418171 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:15 crc kubenswrapper[4816]: I0316 00:36:15.424376 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " 
pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:15 crc kubenswrapper[4816]: I0316 00:36:15.721338 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:15 crc kubenswrapper[4816]: I0316 00:36:15.725989 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" event={"ID":"4de6f751-2471-4ce9-a771-00703e7be02a","Type":"ContainerStarted","Data":"f9d1d4fbe1538ba3d5bfad270eb0c25d99d3990476fb2033e0544e3446cf984e"} Mar 16 00:36:15 crc kubenswrapper[4816]: I0316 00:36:15.728445 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f4698c34-c93e-4d6f-8ab8-2bfcf3118410","Type":"ContainerStarted","Data":"8588f16d5ec005f836154019cd40949f4d783d7ff7a353de6aa46aead0c27846"} Mar 16 00:36:15 crc kubenswrapper[4816]: I0316 00:36:15.730349 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" event={"ID":"a7c7b38e-dd7e-469c-ab38-173944ca2943","Type":"ContainerStarted","Data":"38d2258b8e0ac413732a26743c1c7812f0e79b91c6b5bd0ac251d5ae2c83223c"} Mar 16 00:36:16 crc kubenswrapper[4816]: I0316 00:36:16.225725 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826"] Mar 16 00:36:16 crc kubenswrapper[4816]: W0316 00:36:16.238175 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee673348_980c_44f5_8e33_71a859ce740c.slice/crio-3b07e4f45a2e03a71adc0a786082d1632e61b6814483c597e3c68c14c7ad5d1c WatchSource:0}: Error finding container 3b07e4f45a2e03a71adc0a786082d1632e61b6814483c597e3c68c14c7ad5d1c: Status 404 returned error can't find the container with id 
3b07e4f45a2e03a71adc0a786082d1632e61b6814483c597e3c68c14c7ad5d1c Mar 16 00:36:16 crc kubenswrapper[4816]: I0316 00:36:16.744173 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f4698c34-c93e-4d6f-8ab8-2bfcf3118410","Type":"ContainerStarted","Data":"4903f66673eb9543339774d9ac0280a8999d0232a84a23d8a55a030d79be9dbe"} Mar 16 00:36:16 crc kubenswrapper[4816]: I0316 00:36:16.745727 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" event={"ID":"ee673348-980c-44f5-8e33-71a859ce740c","Type":"ContainerStarted","Data":"3b07e4f45a2e03a71adc0a786082d1632e61b6814483c597e3c68c14c7ad5d1c"} Mar 16 00:36:16 crc kubenswrapper[4816]: I0316 00:36:16.771252 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=15.579575346 podStartE2EDuration="23.771235254s" podCreationTimestamp="2026-03-16 00:35:53 +0000 UTC" firstStartedPulling="2026-03-16 00:36:07.655468051 +0000 UTC m=+1760.751768004" lastFinishedPulling="2026-03-16 00:36:15.847127959 +0000 UTC m=+1768.943427912" observedRunningTime="2026-03-16 00:36:16.765890654 +0000 UTC m=+1769.862190607" watchObservedRunningTime="2026-03-16 00:36:16.771235254 +0000 UTC m=+1769.867535207" Mar 16 00:36:17 crc kubenswrapper[4816]: I0316 00:36:17.755995 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" event={"ID":"ee673348-980c-44f5-8e33-71a859ce740c","Type":"ContainerStarted","Data":"cbce52285f31a47858a3995de5e0aeab1ce413b10f830c469df5d563447b72c3"} Mar 16 00:36:17 crc kubenswrapper[4816]: I0316 00:36:17.902164 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0" Mar 16 00:36:20 crc kubenswrapper[4816]: I0316 00:36:20.881242 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc"] Mar 16 00:36:20 crc kubenswrapper[4816]: I0316 00:36:20.882920 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:20 crc kubenswrapper[4816]: I0316 00:36:20.885584 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert" Mar 16 00:36:20 crc kubenswrapper[4816]: I0316 00:36:20.885828 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap" Mar 16 00:36:20 crc kubenswrapper[4816]: I0316 00:36:20.891319 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc"] Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.037520 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.037599 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.037624 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcb4d\" 
(UniqueName: \"kubernetes.io/projected/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-kube-api-access-bcb4d\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.037663 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.139243 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.139925 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.140086 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: 
\"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.140211 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcb4d\" (UniqueName: \"kubernetes.io/projected/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-kube-api-access-bcb4d\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.139705 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.140937 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.155216 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.164722 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bcb4d\" (UniqueName: \"kubernetes.io/projected/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-kube-api-access-bcb4d\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.261394 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.827904 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts"] Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.829455 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.831725 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.838626 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts"] Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.950933 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/134459cc-413b-4996-a0c1-aafe8dae8ebb-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.950999 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/134459cc-413b-4996-a0c1-aafe8dae8ebb-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.951043 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/134459cc-413b-4996-a0c1-aafe8dae8ebb-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.951089 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm5lr\" (UniqueName: \"kubernetes.io/projected/134459cc-413b-4996-a0c1-aafe8dae8ebb-kube-api-access-nm5lr\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:22 crc kubenswrapper[4816]: I0316 00:36:22.052278 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/134459cc-413b-4996-a0c1-aafe8dae8ebb-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:22 crc kubenswrapper[4816]: I0316 00:36:22.052342 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: 
\"kubernetes.io/secret/134459cc-413b-4996-a0c1-aafe8dae8ebb-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:22 crc kubenswrapper[4816]: I0316 00:36:22.052387 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/134459cc-413b-4996-a0c1-aafe8dae8ebb-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:22 crc kubenswrapper[4816]: I0316 00:36:22.052434 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm5lr\" (UniqueName: \"kubernetes.io/projected/134459cc-413b-4996-a0c1-aafe8dae8ebb-kube-api-access-nm5lr\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:22 crc kubenswrapper[4816]: I0316 00:36:22.053134 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/134459cc-413b-4996-a0c1-aafe8dae8ebb-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:22 crc kubenswrapper[4816]: I0316 00:36:22.053479 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/134459cc-413b-4996-a0c1-aafe8dae8ebb-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " 
pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:22 crc kubenswrapper[4816]: I0316 00:36:22.055876 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/134459cc-413b-4996-a0c1-aafe8dae8ebb-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:22 crc kubenswrapper[4816]: I0316 00:36:22.071320 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm5lr\" (UniqueName: \"kubernetes.io/projected/134459cc-413b-4996-a0c1-aafe8dae8ebb-kube-api-access-nm5lr\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:22 crc kubenswrapper[4816]: I0316 00:36:22.153951 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.343515 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts"] Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.401711 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc"] Mar 16 00:36:27 crc kubenswrapper[4816]: W0316 00:36:27.412132 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71ca3af9_1a2f_4bd2_898a_13c9089b16c2.slice/crio-bb68e8ff2fc4652bd7124872255f8c94e26e82f633c1abdf93bc90d074e3d742 WatchSource:0}: Error finding container bb68e8ff2fc4652bd7124872255f8c94e26e82f633c1abdf93bc90d074e3d742: Status 404 returned error can't find the container with id bb68e8ff2fc4652bd7124872255f8c94e26e82f633c1abdf93bc90d074e3d742 Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.823475 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" event={"ID":"a7c7b38e-dd7e-469c-ab38-173944ca2943","Type":"ContainerStarted","Data":"1b2f537e39ddc44f71e65578b7748d4fdf3a00f92f7162e6c40e1d62de0821f7"} Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.825264 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" event={"ID":"71ca3af9-1a2f-4bd2-898a-13c9089b16c2","Type":"ContainerStarted","Data":"ce9e99a9c02686068162a7b07011e1f6804e66a5e36c18c8a59058d037d454f1"} Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.825294 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" 
event={"ID":"71ca3af9-1a2f-4bd2-898a-13c9089b16c2","Type":"ContainerStarted","Data":"bb68e8ff2fc4652bd7124872255f8c94e26e82f633c1abdf93bc90d074e3d742"} Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.827733 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" event={"ID":"ee673348-980c-44f5-8e33-71a859ce740c","Type":"ContainerStarted","Data":"a93224c7f0c626d700e911445e6c6addd4f7c4e8be9b3890ffa81e6e0d5c2d7d"} Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.827768 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" event={"ID":"ee673348-980c-44f5-8e33-71a859ce740c","Type":"ContainerStarted","Data":"4d8e0927fe2232f6e091b4714aa5409aa783badeda2f3341d66e596881120b91"} Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.830282 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" event={"ID":"4de6f751-2471-4ce9-a771-00703e7be02a","Type":"ContainerStarted","Data":"7612b551538e59dc2650d3886c84b285d186f59576fce11885d1f9d53d3a74b2"} Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.831711 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" event={"ID":"134459cc-413b-4996-a0c1-aafe8dae8ebb","Type":"ContainerStarted","Data":"5d592ef65f51434a32e7ae05e9ef625d7d9825e6f8d4a65e1163ca86b1e07ca1"} Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.831739 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" event={"ID":"134459cc-413b-4996-a0c1-aafe8dae8ebb","Type":"ContainerStarted","Data":"48260c064384b04eabf949717c870cc6fe73c60cf45eba7e46ed0c8dc99d1dc5"} Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.846207 4816 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" podStartSLOduration=3.868825257 podStartE2EDuration="17.846187881s" podCreationTimestamp="2026-03-16 00:36:10 +0000 UTC" firstStartedPulling="2026-03-16 00:36:13.160460158 +0000 UTC m=+1766.256760111" lastFinishedPulling="2026-03-16 00:36:27.137822772 +0000 UTC m=+1780.234122735" observedRunningTime="2026-03-16 00:36:27.839295416 +0000 UTC m=+1780.935595369" watchObservedRunningTime="2026-03-16 00:36:27.846187881 +0000 UTC m=+1780.942487844" Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.869857 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" podStartSLOduration=6.93558617 podStartE2EDuration="20.869838968s" podCreationTimestamp="2026-03-16 00:36:07 +0000 UTC" firstStartedPulling="2026-03-16 00:36:13.160798657 +0000 UTC m=+1766.257098610" lastFinishedPulling="2026-03-16 00:36:27.095051455 +0000 UTC m=+1780.191351408" observedRunningTime="2026-03-16 00:36:27.866413761 +0000 UTC m=+1780.962713714" watchObservedRunningTime="2026-03-16 00:36:27.869838968 +0000 UTC m=+1780.966138931" Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.900063 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" podStartSLOduration=3.6838836390000003 podStartE2EDuration="14.90003608s" podCreationTimestamp="2026-03-16 00:36:13 +0000 UTC" firstStartedPulling="2026-03-16 00:36:16.2426962 +0000 UTC m=+1769.338996163" lastFinishedPulling="2026-03-16 00:36:27.458848651 +0000 UTC m=+1780.555148604" observedRunningTime="2026-03-16 00:36:27.894635557 +0000 UTC m=+1780.990935510" watchObservedRunningTime="2026-03-16 00:36:27.90003608 +0000 UTC m=+1780.996336033" Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.903527 4816 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0" Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.954740 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0" Mar 16 00:36:28 crc kubenswrapper[4816]: I0316 00:36:28.840718 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" event={"ID":"71ca3af9-1a2f-4bd2-898a-13c9089b16c2","Type":"ContainerStarted","Data":"15495ca1215534138ba301daf0e5d9775679806317dd191e3f541ef7002473a8"} Mar 16 00:36:28 crc kubenswrapper[4816]: I0316 00:36:28.843118 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" event={"ID":"134459cc-413b-4996-a0c1-aafe8dae8ebb","Type":"ContainerStarted","Data":"3b42bffd699bcd4c96385d701e9f370ec5fcc2d09a04c176fa9fda031519a60d"} Mar 16 00:36:28 crc kubenswrapper[4816]: I0316 00:36:28.872912 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" podStartSLOduration=7.48546219 podStartE2EDuration="7.872888382s" podCreationTimestamp="2026-03-16 00:36:21 +0000 UTC" firstStartedPulling="2026-03-16 00:36:27.352467169 +0000 UTC m=+1780.448767122" lastFinishedPulling="2026-03-16 00:36:27.739893361 +0000 UTC m=+1780.836193314" observedRunningTime="2026-03-16 00:36:28.871390409 +0000 UTC m=+1781.967690402" watchObservedRunningTime="2026-03-16 00:36:28.872888382 +0000 UTC m=+1781.969188375" Mar 16 00:36:28 crc kubenswrapper[4816]: I0316 00:36:28.877511 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" podStartSLOduration=8.554112367 podStartE2EDuration="8.877495232s" podCreationTimestamp="2026-03-16 00:36:20 +0000 UTC" 
firstStartedPulling="2026-03-16 00:36:27.4155761 +0000 UTC m=+1780.511876053" lastFinishedPulling="2026-03-16 00:36:27.738958965 +0000 UTC m=+1780.835258918" observedRunningTime="2026-03-16 00:36:28.856028636 +0000 UTC m=+1781.952328609" watchObservedRunningTime="2026-03-16 00:36:28.877495232 +0000 UTC m=+1781.973795215" Mar 16 00:36:28 crc kubenswrapper[4816]: I0316 00:36:28.894235 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0" Mar 16 00:36:31 crc kubenswrapper[4816]: I0316 00:36:31.862958 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:36:31 crc kubenswrapper[4816]: I0316 00:36:31.863473 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:36:31 crc kubenswrapper[4816]: I0316 00:36:31.863517 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:36:31 crc kubenswrapper[4816]: I0316 00:36:31.864114 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a"} pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:36:31 crc kubenswrapper[4816]: I0316 00:36:31.864156 4816 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" containerID="cri-o://ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" gracePeriod=600 Mar 16 00:36:32 crc kubenswrapper[4816]: E0316 00:36:32.001298 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:36:32 crc kubenswrapper[4816]: I0316 00:36:32.880318 4816 generic.go:334] "Generic (PLEG): container finished" podID="dd08ece2-7636-4966-973a-e96a34b70b53" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" exitCode=0 Mar 16 00:36:32 crc kubenswrapper[4816]: I0316 00:36:32.880408 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerDied","Data":"ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a"} Mar 16 00:36:32 crc kubenswrapper[4816]: I0316 00:36:32.880690 4816 scope.go:117] "RemoveContainer" containerID="92fd160da980a35a692640a98195800839c4f80b2447586e89c2230217ad0071" Mar 16 00:36:32 crc kubenswrapper[4816]: I0316 00:36:32.881644 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:36:32 crc kubenswrapper[4816]: E0316 00:36:32.881941 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:36:33 crc kubenswrapper[4816]: I0316 00:36:33.972831 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-zdlgx"] Mar 16 00:36:33 crc kubenswrapper[4816]: I0316 00:36:33.973096 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" podUID="91573536-f8d4-475f-bfb6-dd2ad9910ce0" containerName="default-interconnect" containerID="cri-o://b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3" gracePeriod=30 Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.338804 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.449877 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-credentials\") pod \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.449980 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-config\") pod \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.450025 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: 
\"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-ca\") pod \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.450131 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-users\") pod \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.450185 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-credentials\") pod \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.450215 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mth5q\" (UniqueName: \"kubernetes.io/projected/91573536-f8d4-475f-bfb6-dd2ad9910ce0-kube-api-access-mth5q\") pod \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.450250 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-ca\") pod \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.450769 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "91573536-f8d4-475f-bfb6-dd2ad9910ce0" 
(UID: "91573536-f8d4-475f-bfb6-dd2ad9910ce0"). InnerVolumeSpecName "sasl-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.451398 4816 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.455652 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91573536-f8d4-475f-bfb6-dd2ad9910ce0-kube-api-access-mth5q" (OuterVolumeSpecName: "kube-api-access-mth5q") pod "91573536-f8d4-475f-bfb6-dd2ad9910ce0" (UID: "91573536-f8d4-475f-bfb6-dd2ad9910ce0"). InnerVolumeSpecName "kube-api-access-mth5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.455709 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "91573536-f8d4-475f-bfb6-dd2ad9910ce0" (UID: "91573536-f8d4-475f-bfb6-dd2ad9910ce0"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.456537 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "91573536-f8d4-475f-bfb6-dd2ad9910ce0" (UID: "91573536-f8d4-475f-bfb6-dd2ad9910ce0"). InnerVolumeSpecName "default-interconnect-openstack-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.459708 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "91573536-f8d4-475f-bfb6-dd2ad9910ce0" (UID: "91573536-f8d4-475f-bfb6-dd2ad9910ce0"). InnerVolumeSpecName "default-interconnect-openstack-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.459950 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "91573536-f8d4-475f-bfb6-dd2ad9910ce0" (UID: "91573536-f8d4-475f-bfb6-dd2ad9910ce0"). InnerVolumeSpecName "default-interconnect-inter-router-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.461662 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "91573536-f8d4-475f-bfb6-dd2ad9910ce0" (UID: "91573536-f8d4-475f-bfb6-dd2ad9910ce0"). InnerVolumeSpecName "sasl-users". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.552611 4816 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\"" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.552648 4816 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.552661 4816 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-users\") on node \"crc\" DevicePath \"\"" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.552672 4816 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.552683 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mth5q\" (UniqueName: \"kubernetes.io/projected/91573536-f8d4-475f-bfb6-dd2ad9910ce0-kube-api-access-mth5q\") on node \"crc\" DevicePath \"\"" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.552692 4816 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.894353 4816 generic.go:334] "Generic (PLEG): container finished" 
podID="91573536-f8d4-475f-bfb6-dd2ad9910ce0" containerID="b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3" exitCode=0 Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.894403 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" event={"ID":"91573536-f8d4-475f-bfb6-dd2ad9910ce0","Type":"ContainerDied","Data":"b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3"} Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.894717 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" event={"ID":"91573536-f8d4-475f-bfb6-dd2ad9910ce0","Type":"ContainerDied","Data":"26dc3f4ee67e3496e2e4a3275b5368fe7536b3be135627eecf62f5d01c3d1d56"} Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.894739 4816 scope.go:117] "RemoveContainer" containerID="b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.894434 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.902413 4816 generic.go:334] "Generic (PLEG): container finished" podID="4de6f751-2471-4ce9-a771-00703e7be02a" containerID="f9d1d4fbe1538ba3d5bfad270eb0c25d99d3990476fb2033e0544e3446cf984e" exitCode=0 Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.902500 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" event={"ID":"4de6f751-2471-4ce9-a771-00703e7be02a","Type":"ContainerDied","Data":"f9d1d4fbe1538ba3d5bfad270eb0c25d99d3990476fb2033e0544e3446cf984e"} Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.903009 4816 scope.go:117] "RemoveContainer" containerID="f9d1d4fbe1538ba3d5bfad270eb0c25d99d3990476fb2033e0544e3446cf984e" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.905173 4816 generic.go:334] "Generic (PLEG): container finished" podID="134459cc-413b-4996-a0c1-aafe8dae8ebb" containerID="5d592ef65f51434a32e7ae05e9ef625d7d9825e6f8d4a65e1163ca86b1e07ca1" exitCode=0 Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.905230 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" event={"ID":"134459cc-413b-4996-a0c1-aafe8dae8ebb","Type":"ContainerDied","Data":"5d592ef65f51434a32e7ae05e9ef625d7d9825e6f8d4a65e1163ca86b1e07ca1"} Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.913145 4816 scope.go:117] "RemoveContainer" containerID="5d592ef65f51434a32e7ae05e9ef625d7d9825e6f8d4a65e1163ca86b1e07ca1" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.917801 4816 scope.go:117] "RemoveContainer" containerID="b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3" Mar 16 00:36:34 crc kubenswrapper[4816]: E0316 00:36:34.923680 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3\": container with ID starting with b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3 not found: ID does not exist" containerID="b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.923746 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3"} err="failed to get container status \"b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3\": rpc error: code = NotFound desc = could not find container \"b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3\": container with ID starting with b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3 not found: ID does not exist" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.925261 4816 generic.go:334] "Generic (PLEG): container finished" podID="a7c7b38e-dd7e-469c-ab38-173944ca2943" containerID="38d2258b8e0ac413732a26743c1c7812f0e79b91c6b5bd0ac251d5ae2c83223c" exitCode=0 Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.925355 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" event={"ID":"a7c7b38e-dd7e-469c-ab38-173944ca2943","Type":"ContainerDied","Data":"38d2258b8e0ac413732a26743c1c7812f0e79b91c6b5bd0ac251d5ae2c83223c"} Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.925884 4816 scope.go:117] "RemoveContainer" containerID="38d2258b8e0ac413732a26743c1c7812f0e79b91c6b5bd0ac251d5ae2c83223c" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.928970 4816 generic.go:334] "Generic (PLEG): container finished" podID="71ca3af9-1a2f-4bd2-898a-13c9089b16c2" containerID="ce9e99a9c02686068162a7b07011e1f6804e66a5e36c18c8a59058d037d454f1" exitCode=0 Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.929065 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" event={"ID":"71ca3af9-1a2f-4bd2-898a-13c9089b16c2","Type":"ContainerDied","Data":"ce9e99a9c02686068162a7b07011e1f6804e66a5e36c18c8a59058d037d454f1"} Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.929644 4816 scope.go:117] "RemoveContainer" containerID="ce9e99a9c02686068162a7b07011e1f6804e66a5e36c18c8a59058d037d454f1" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.968177 4816 generic.go:334] "Generic (PLEG): container finished" podID="ee673348-980c-44f5-8e33-71a859ce740c" containerID="4d8e0927fe2232f6e091b4714aa5409aa783badeda2f3341d66e596881120b91" exitCode=0 Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.968222 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" event={"ID":"ee673348-980c-44f5-8e33-71a859ce740c","Type":"ContainerDied","Data":"4d8e0927fe2232f6e091b4714aa5409aa783badeda2f3341d66e596881120b91"} Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.968741 4816 scope.go:117] "RemoveContainer" containerID="4d8e0927fe2232f6e091b4714aa5409aa783badeda2f3341d66e596881120b91" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.995610 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-zdlgx"] Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.009074 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-zdlgx"] Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.638391 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-pdx9k"] Mar 16 00:36:35 crc kubenswrapper[4816]: E0316 00:36:35.638976 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91573536-f8d4-475f-bfb6-dd2ad9910ce0" containerName="default-interconnect" Mar 16 
00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.638989 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="91573536-f8d4-475f-bfb6-dd2ad9910ce0" containerName="default-interconnect" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.639096 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="91573536-f8d4-475f-bfb6-dd2ad9910ce0" containerName="default-interconnect" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.639540 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.641688 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.641842 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.641875 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.642013 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.642573 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-j42x2" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.642828 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.643271 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.699103 4816 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91573536-f8d4-475f-bfb6-dd2ad9910ce0" path="/var/lib/kubelet/pods/91573536-f8d4-475f-bfb6-dd2ad9910ce0/volumes" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.699809 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-pdx9k"] Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.771931 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.771975 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9x2x\" (UniqueName: \"kubernetes.io/projected/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-kube-api-access-h9x2x\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.772015 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-sasl-config\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.772053 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: 
\"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.772144 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.772180 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.772207 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-sasl-users\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.873612 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: 
\"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.873746 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.873785 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-sasl-users\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.873875 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.873925 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9x2x\" (UniqueName: \"kubernetes.io/projected/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-kube-api-access-h9x2x\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.873970 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-sasl-config\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k"
Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.874028 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k"
Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.875370 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-sasl-config\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k"
Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.879051 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k"
Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.879707 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k"
Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.879949 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-sasl-users\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k"
Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.880466 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k"
Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.882069 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k"
Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.891993 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9x2x\" (UniqueName: \"kubernetes.io/projected/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-kube-api-access-h9x2x\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k"
Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.956723 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-pdx9k"
Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.978531 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" event={"ID":"71ca3af9-1a2f-4bd2-898a-13c9089b16c2","Type":"ContainerStarted","Data":"e016e48a78797dc3b0cf64c1397f7cda09474ea1f3aadd3add5fdda8aa3efae5"}
Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.986184 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" event={"ID":"ee673348-980c-44f5-8e33-71a859ce740c","Type":"ContainerStarted","Data":"71358c543140d46dcc9029d7f9ba4f05810ef31f0e4a20fc51f4ce7cf5f8e9fe"}
Mar 16 00:36:36 crc kubenswrapper[4816]: I0316 00:36:36.006878 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" event={"ID":"4de6f751-2471-4ce9-a771-00703e7be02a","Type":"ContainerStarted","Data":"1d9c2a4c897d022060b76111731aab393689e9fa548bd986e485af3c95f78535"}
Mar 16 00:36:36 crc kubenswrapper[4816]: I0316 00:36:36.028107 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" event={"ID":"134459cc-413b-4996-a0c1-aafe8dae8ebb","Type":"ContainerStarted","Data":"d197a4e61a58561f29290e7cda9877096c50a36f6c773e38c2083c2bb6fe68f3"}
Mar 16 00:36:36 crc kubenswrapper[4816]: I0316 00:36:36.061525 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" event={"ID":"a7c7b38e-dd7e-469c-ab38-173944ca2943","Type":"ContainerStarted","Data":"9de64e014f9263fa521079f37ae8e5d94487bd630409f3a42736c5916a55fc3d"}
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.014241 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-pdx9k"]
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.075285 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" event={"ID":"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0","Type":"ContainerStarted","Data":"6447d5e9ac68ceaa250e584e16281615c8ee2affd7076b26c02ac59d7bbb5483"}
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.079257 4816 generic.go:334] "Generic (PLEG): container finished" podID="a7c7b38e-dd7e-469c-ab38-173944ca2943" containerID="9de64e014f9263fa521079f37ae8e5d94487bd630409f3a42736c5916a55fc3d" exitCode=0
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.079363 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" event={"ID":"a7c7b38e-dd7e-469c-ab38-173944ca2943","Type":"ContainerDied","Data":"9de64e014f9263fa521079f37ae8e5d94487bd630409f3a42736c5916a55fc3d"}
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.083681 4816 scope.go:117] "RemoveContainer" containerID="38d2258b8e0ac413732a26743c1c7812f0e79b91c6b5bd0ac251d5ae2c83223c"
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.084643 4816 scope.go:117] "RemoveContainer" containerID="9de64e014f9263fa521079f37ae8e5d94487bd630409f3a42736c5916a55fc3d"
Mar 16 00:36:37 crc kubenswrapper[4816]: E0316 00:36:37.084951 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v_service-telemetry(a7c7b38e-dd7e-469c-ab38-173944ca2943)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" podUID="a7c7b38e-dd7e-469c-ab38-173944ca2943"
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.094358 4816 generic.go:334] "Generic (PLEG): container finished" podID="71ca3af9-1a2f-4bd2-898a-13c9089b16c2" containerID="e016e48a78797dc3b0cf64c1397f7cda09474ea1f3aadd3add5fdda8aa3efae5" exitCode=0
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.094453 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" event={"ID":"71ca3af9-1a2f-4bd2-898a-13c9089b16c2","Type":"ContainerDied","Data":"e016e48a78797dc3b0cf64c1397f7cda09474ea1f3aadd3add5fdda8aa3efae5"}
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.095234 4816 scope.go:117] "RemoveContainer" containerID="e016e48a78797dc3b0cf64c1397f7cda09474ea1f3aadd3add5fdda8aa3efae5"
Mar 16 00:36:37 crc kubenswrapper[4816]: E0316 00:36:37.095738 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-598dc6844-7wkmc_service-telemetry(71ca3af9-1a2f-4bd2-898a-13c9089b16c2)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" podUID="71ca3af9-1a2f-4bd2-898a-13c9089b16c2"
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.111019 4816 generic.go:334] "Generic (PLEG): container finished" podID="ee673348-980c-44f5-8e33-71a859ce740c" containerID="71358c543140d46dcc9029d7f9ba4f05810ef31f0e4a20fc51f4ce7cf5f8e9fe" exitCode=0
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.111113 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" event={"ID":"ee673348-980c-44f5-8e33-71a859ce740c","Type":"ContainerDied","Data":"71358c543140d46dcc9029d7f9ba4f05810ef31f0e4a20fc51f4ce7cf5f8e9fe"}
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.111884 4816 scope.go:117] "RemoveContainer" containerID="71358c543140d46dcc9029d7f9ba4f05810ef31f0e4a20fc51f4ce7cf5f8e9fe"
Mar 16 00:36:37 crc kubenswrapper[4816]: E0316 00:36:37.112149 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-5759b4d97-lr826_service-telemetry(ee673348-980c-44f5-8e33-71a859ce740c)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" podUID="ee673348-980c-44f5-8e33-71a859ce740c"
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.115153 4816 generic.go:334] "Generic (PLEG): container finished" podID="4de6f751-2471-4ce9-a771-00703e7be02a" containerID="1d9c2a4c897d022060b76111731aab393689e9fa548bd986e485af3c95f78535" exitCode=0
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.115220 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" event={"ID":"4de6f751-2471-4ce9-a771-00703e7be02a","Type":"ContainerDied","Data":"1d9c2a4c897d022060b76111731aab393689e9fa548bd986e485af3c95f78535"}
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.116029 4816 scope.go:117] "RemoveContainer" containerID="1d9c2a4c897d022060b76111731aab393689e9fa548bd986e485af3c95f78535"
Mar 16 00:36:37 crc kubenswrapper[4816]: E0316 00:36:37.116319 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t_service-telemetry(4de6f751-2471-4ce9-a771-00703e7be02a)\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" podUID="4de6f751-2471-4ce9-a771-00703e7be02a"
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.118000 4816 generic.go:334] "Generic (PLEG): container finished" podID="134459cc-413b-4996-a0c1-aafe8dae8ebb" containerID="d197a4e61a58561f29290e7cda9877096c50a36f6c773e38c2083c2bb6fe68f3" exitCode=0
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.118023 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" event={"ID":"134459cc-413b-4996-a0c1-aafe8dae8ebb","Type":"ContainerDied","Data":"d197a4e61a58561f29290e7cda9877096c50a36f6c773e38c2083c2bb6fe68f3"}
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.118406 4816 scope.go:117] "RemoveContainer" containerID="d197a4e61a58561f29290e7cda9877096c50a36f6c773e38c2083c2bb6fe68f3"
Mar 16 00:36:37 crc kubenswrapper[4816]: E0316 00:36:37.118661 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts_service-telemetry(134459cc-413b-4996-a0c1-aafe8dae8ebb)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" podUID="134459cc-413b-4996-a0c1-aafe8dae8ebb"
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.166126 4816 scope.go:117] "RemoveContainer" containerID="ce9e99a9c02686068162a7b07011e1f6804e66a5e36c18c8a59058d037d454f1"
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.274686 4816 scope.go:117] "RemoveContainer" containerID="4d8e0927fe2232f6e091b4714aa5409aa783badeda2f3341d66e596881120b91"
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.330240 4816 scope.go:117] "RemoveContainer" containerID="f9d1d4fbe1538ba3d5bfad270eb0c25d99d3990476fb2033e0544e3446cf984e"
Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.378316 4816 scope.go:117] "RemoveContainer" containerID="5d592ef65f51434a32e7ae05e9ef625d7d9825e6f8d4a65e1163ca86b1e07ca1"
Mar 16 00:36:38 crc kubenswrapper[4816]: I0316 00:36:38.130804 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" event={"ID":"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0","Type":"ContainerStarted","Data":"9ccc389f0ff36e4f9911d642e3cd6149ad785d77e5645e73cd4748b136fb9338"}
Mar 16 00:36:38 crc kubenswrapper[4816]: I0316 00:36:38.155041 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" podStartSLOduration=5.155004348 podStartE2EDuration="5.155004348s" podCreationTimestamp="2026-03-16 00:36:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:36:38.148469654 +0000 UTC m=+1791.244769607" watchObservedRunningTime="2026-03-16 00:36:38.155004348 +0000 UTC m=+1791.251304301"
Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.245089 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"]
Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.246526 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test"
Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.249148 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config"
Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.249512 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned"
Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.255333 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"]
Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.358975 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/e014dd01-f826-4642-bb43-dbdab4a1e503-qdr-test-config\") pod \"qdr-test\" (UID: \"e014dd01-f826-4642-bb43-dbdab4a1e503\") " pod="service-telemetry/qdr-test"
Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.359088 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48w8l\" (UniqueName: \"kubernetes.io/projected/e014dd01-f826-4642-bb43-dbdab4a1e503-kube-api-access-48w8l\") pod \"qdr-test\" (UID: \"e014dd01-f826-4642-bb43-dbdab4a1e503\") " pod="service-telemetry/qdr-test"
Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.359203 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/e014dd01-f826-4642-bb43-dbdab4a1e503-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"e014dd01-f826-4642-bb43-dbdab4a1e503\") " pod="service-telemetry/qdr-test"
Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.461590 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/e014dd01-f826-4642-bb43-dbdab4a1e503-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"e014dd01-f826-4642-bb43-dbdab4a1e503\") " pod="service-telemetry/qdr-test"
Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.462051 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/e014dd01-f826-4642-bb43-dbdab4a1e503-qdr-test-config\") pod \"qdr-test\" (UID: \"e014dd01-f826-4642-bb43-dbdab4a1e503\") " pod="service-telemetry/qdr-test"
Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.462087 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48w8l\" (UniqueName: \"kubernetes.io/projected/e014dd01-f826-4642-bb43-dbdab4a1e503-kube-api-access-48w8l\") pod \"qdr-test\" (UID: \"e014dd01-f826-4642-bb43-dbdab4a1e503\") " pod="service-telemetry/qdr-test"
Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.463006 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/e014dd01-f826-4642-bb43-dbdab4a1e503-qdr-test-config\") pod \"qdr-test\" (UID: \"e014dd01-f826-4642-bb43-dbdab4a1e503\") " pod="service-telemetry/qdr-test"
Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.484692 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/e014dd01-f826-4642-bb43-dbdab4a1e503-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"e014dd01-f826-4642-bb43-dbdab4a1e503\") " pod="service-telemetry/qdr-test"
Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.486179 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48w8l\" (UniqueName: \"kubernetes.io/projected/e014dd01-f826-4642-bb43-dbdab4a1e503-kube-api-access-48w8l\") pod \"qdr-test\" (UID: \"e014dd01-f826-4642-bb43-dbdab4a1e503\") " pod="service-telemetry/qdr-test"
Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.685543 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test"
Mar 16 00:36:40 crc kubenswrapper[4816]: I0316 00:36:40.106644 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"]
Mar 16 00:36:40 crc kubenswrapper[4816]: I0316 00:36:40.154257 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"e014dd01-f826-4642-bb43-dbdab4a1e503","Type":"ContainerStarted","Data":"67c52596879ee5f97550865d1d0d9f1c65faddf356aba2e3cf412b3627b9ccd2"}
Mar 16 00:36:44 crc kubenswrapper[4816]: I0316 00:36:44.667559 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a"
Mar 16 00:36:44 crc kubenswrapper[4816]: E0316 00:36:44.668214 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53"
Mar 16 00:36:48 crc kubenswrapper[4816]: I0316 00:36:48.667389 4816 scope.go:117] "RemoveContainer" containerID="d197a4e61a58561f29290e7cda9877096c50a36f6c773e38c2083c2bb6fe68f3"
Mar 16 00:36:48 crc kubenswrapper[4816]: I0316 00:36:48.669357 4816 scope.go:117] "RemoveContainer" containerID="71358c543140d46dcc9029d7f9ba4f05810ef31f0e4a20fc51f4ce7cf5f8e9fe"
Mar 16 00:36:50 crc kubenswrapper[4816]: I0316 00:36:50.668200 4816 scope.go:117] "RemoveContainer" containerID="e016e48a78797dc3b0cf64c1397f7cda09474ea1f3aadd3add5fdda8aa3efae5"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.241328 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" event={"ID":"134459cc-413b-4996-a0c1-aafe8dae8ebb","Type":"ContainerStarted","Data":"5f3024deeed51fd8bdc966d9c0ccf9f70fef0634d72805adf14cd3fe5250b028"}
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.243055 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"e014dd01-f826-4642-bb43-dbdab4a1e503","Type":"ContainerStarted","Data":"269fbfe42442cb6eba5484dbb42362b640f52dbb1f8ab10fd7c9f6ce790f6fb9"}
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.249899 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" event={"ID":"71ca3af9-1a2f-4bd2-898a-13c9089b16c2","Type":"ContainerStarted","Data":"dffc633da63589e27e68b277b5f3d8bc25104b4b4f15449073562fc7d1e7d31e"}
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.256665 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" event={"ID":"ee673348-980c-44f5-8e33-71a859ce740c","Type":"ContainerStarted","Data":"4664d3b7085c3c707a8f0d1c0767e164bf765d5f82870b12f6105bae899d19e0"}
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.314565 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=2.121244039 podStartE2EDuration="12.314532807s" podCreationTimestamp="2026-03-16 00:36:39 +0000 UTC" firstStartedPulling="2026-03-16 00:36:40.111355022 +0000 UTC m=+1793.207654975" lastFinishedPulling="2026-03-16 00:36:50.30464379 +0000 UTC m=+1803.400943743" observedRunningTime="2026-03-16 00:36:51.296282322 +0000 UTC m=+1804.392582295" watchObservedRunningTime="2026-03-16 00:36:51.314532807 +0000 UTC m=+1804.410832760"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.591674 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-78t94"]
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.594881 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.604034 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.604307 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.604461 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.604631 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.605990 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.606026 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.636803 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-78t94"]
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.726964 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-healthcheck-log\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.727033 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.727061 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-publisher\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.727081 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-config\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.727139 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-sensubility-config\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.727156 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.727200 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj6gg\" (UniqueName: \"kubernetes.io/projected/bfb5a27e-f6df-44ba-8f68-446946410953-kube-api-access-mj6gg\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.828529 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-healthcheck-log\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.829077 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.829295 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-publisher\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.830175 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-healthcheck-log\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.830204 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.830710 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-publisher\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.830845 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-config\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.831862 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-sensubility-config\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.833213 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.834140 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj6gg\" (UniqueName: \"kubernetes.io/projected/bfb5a27e-f6df-44ba-8f68-446946410953-kube-api-access-mj6gg\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.831701 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-config\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.834078 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.833166 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-sensubility-config\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.861119 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj6gg\" (UniqueName: \"kubernetes.io/projected/bfb5a27e-f6df-44ba-8f68-446946410953-kube-api-access-mj6gg\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.926529 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-78t94"
Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.002769 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"]
Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.006014 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl"
Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.025879 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"]
Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.036765 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhk6p\" (UniqueName: \"kubernetes.io/projected/a5b0de9a-0a4c-468c-b49f-575ecbc053e3-kube-api-access-vhk6p\") pod \"curl\" (UID: \"a5b0de9a-0a4c-468c-b49f-575ecbc053e3\") " pod="service-telemetry/curl"
Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.139330 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhk6p\" (UniqueName: \"kubernetes.io/projected/a5b0de9a-0a4c-468c-b49f-575ecbc053e3-kube-api-access-vhk6p\") pod \"curl\" (UID: \"a5b0de9a-0a4c-468c-b49f-575ecbc053e3\") " pod="service-telemetry/curl"
Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.160186 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhk6p\" (UniqueName: \"kubernetes.io/projected/a5b0de9a-0a4c-468c-b49f-575ecbc053e3-kube-api-access-vhk6p\") pod \"curl\" (UID: \"a5b0de9a-0a4c-468c-b49f-575ecbc053e3\") " pod="service-telemetry/curl"
Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.360929 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl"
Mar 16 00:36:52 crc kubenswrapper[4816]: W0316 00:36:52.384987 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfb5a27e_f6df_44ba_8f68_446946410953.slice/crio-b874d2ee7cf3967963c00611ffab38c70d37853ea80cc511f1984da79b61eef1 WatchSource:0}: Error finding container b874d2ee7cf3967963c00611ffab38c70d37853ea80cc511f1984da79b61eef1: Status 404 returned error can't find the container with id b874d2ee7cf3967963c00611ffab38c70d37853ea80cc511f1984da79b61eef1
Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.388030 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-78t94"]
Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.574042 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"]
Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.667260 4816 scope.go:117] "RemoveContainer" containerID="1d9c2a4c897d022060b76111731aab393689e9fa548bd986e485af3c95f78535"
Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.667601 4816 scope.go:117] "RemoveContainer" containerID="9de64e014f9263fa521079f37ae8e5d94487bd630409f3a42736c5916a55fc3d"
Mar 16 00:36:53 crc kubenswrapper[4816]: I0316 00:36:53.286793 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"a5b0de9a-0a4c-468c-b49f-575ecbc053e3","Type":"ContainerStarted","Data":"10214b7e53a9fba954863af7a37aa60f7cfe09a8c0032bf25fa1b27dc6986052"}
Mar 16 00:36:53 crc kubenswrapper[4816]: I0316 00:36:53.293872 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" event={"ID":"a7c7b38e-dd7e-469c-ab38-173944ca2943","Type":"ContainerStarted","Data":"c5909de608a6f63e49b5ed75bb4931e802de58428fa60054194935d2f74d6935"}
Mar 16 00:36:53 crc kubenswrapper[4816]: I0316 00:36:53.297284 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-78t94" event={"ID":"bfb5a27e-f6df-44ba-8f68-446946410953","Type":"ContainerStarted","Data":"b874d2ee7cf3967963c00611ffab38c70d37853ea80cc511f1984da79b61eef1"}
Mar 16 00:36:53 crc kubenswrapper[4816]: I0316 00:36:53.325305 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" event={"ID":"4de6f751-2471-4ce9-a771-00703e7be02a","Type":"ContainerStarted","Data":"f9d46a2edd2a0590c87d3bccb06b155ce8d343cace750651acb690e7067885c4"}
Mar 16 00:36:55 crc kubenswrapper[4816]: I0316 00:36:55.344823 4816 generic.go:334] "Generic (PLEG): container finished" podID="a5b0de9a-0a4c-468c-b49f-575ecbc053e3" containerID="69e70052395bd582e905946efddaebf9f7da1e715ed33c4428ae00a7088cbadf" exitCode=0
Mar 16 00:36:55 crc kubenswrapper[4816]: I0316 00:36:55.344920 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"a5b0de9a-0a4c-468c-b49f-575ecbc053e3","Type":"ContainerDied","Data":"69e70052395bd582e905946efddaebf9f7da1e715ed33c4428ae00a7088cbadf"}
Mar 16 00:36:58 crc kubenswrapper[4816]: I0316 00:36:58.667454 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a"
Mar 16 00:36:58 crc kubenswrapper[4816]: E0316 00:36:58.668179 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53"
Mar 16 00:37:01 crc kubenswrapper[4816]: I0316 00:37:01.932512 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl"
Mar 16 00:37:02 crc kubenswrapper[4816]: I0316 00:37:02.018389 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhk6p\" (UniqueName: \"kubernetes.io/projected/a5b0de9a-0a4c-468c-b49f-575ecbc053e3-kube-api-access-vhk6p\") pod \"a5b0de9a-0a4c-468c-b49f-575ecbc053e3\" (UID: \"a5b0de9a-0a4c-468c-b49f-575ecbc053e3\") "
Mar 16 00:37:02 crc kubenswrapper[4816]: I0316 00:37:02.023736 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5b0de9a-0a4c-468c-b49f-575ecbc053e3-kube-api-access-vhk6p" (OuterVolumeSpecName: "kube-api-access-vhk6p") pod "a5b0de9a-0a4c-468c-b49f-575ecbc053e3" (UID: "a5b0de9a-0a4c-468c-b49f-575ecbc053e3"). InnerVolumeSpecName "kube-api-access-vhk6p".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:37:02 crc kubenswrapper[4816]: I0316 00:37:02.080320 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_a5b0de9a-0a4c-468c-b49f-575ecbc053e3/curl/0.log" Mar 16 00:37:02 crc kubenswrapper[4816]: I0316 00:37:02.120011 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhk6p\" (UniqueName: \"kubernetes.io/projected/a5b0de9a-0a4c-468c-b49f-575ecbc053e3-kube-api-access-vhk6p\") on node \"crc\" DevicePath \"\"" Mar 16 00:37:02 crc kubenswrapper[4816]: I0316 00:37:02.301199 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-2n6hc_5be5ca83-5116-48dc-8d6c-733cbd3e9682/prometheus-webhook-snmp/0.log" Mar 16 00:37:02 crc kubenswrapper[4816]: I0316 00:37:02.395514 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"a5b0de9a-0a4c-468c-b49f-575ecbc053e3","Type":"ContainerDied","Data":"10214b7e53a9fba954863af7a37aa60f7cfe09a8c0032bf25fa1b27dc6986052"} Mar 16 00:37:02 crc kubenswrapper[4816]: I0316 00:37:02.395567 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Mar 16 00:37:02 crc kubenswrapper[4816]: I0316 00:37:02.395577 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10214b7e53a9fba954863af7a37aa60f7cfe09a8c0032bf25fa1b27dc6986052" Mar 16 00:37:04 crc kubenswrapper[4816]: I0316 00:37:04.412385 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-78t94" event={"ID":"bfb5a27e-f6df-44ba-8f68-446946410953","Type":"ContainerStarted","Data":"dff32730efec159cd6d8cf3ea205ebc1778cc70fb4d2a5f6c005a8a2de2b5ade"} Mar 16 00:37:05 crc kubenswrapper[4816]: I0316 00:37:05.389970 4816 scope.go:117] "RemoveContainer" containerID="a1355d11ec449f6a9fd6597a935b6361539d556da9968192441a1a7760e23960" Mar 16 00:37:11 crc kubenswrapper[4816]: I0316 00:37:11.465129 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-78t94" event={"ID":"bfb5a27e-f6df-44ba-8f68-446946410953","Type":"ContainerStarted","Data":"694d068f0560de8c0ccec981067fb7a82ff8d6a56fdaddbf1e422c2f99d7ec45"} Mar 16 00:37:11 crc kubenswrapper[4816]: I0316 00:37:11.491791 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-78t94" podStartSLOduration=2.5798966290000003 podStartE2EDuration="20.491771344s" podCreationTimestamp="2026-03-16 00:36:51 +0000 UTC" firstStartedPulling="2026-03-16 00:36:52.391220988 +0000 UTC m=+1805.487520941" lastFinishedPulling="2026-03-16 00:37:10.303095683 +0000 UTC m=+1823.399395656" observedRunningTime="2026-03-16 00:37:11.485752734 +0000 UTC m=+1824.582052687" watchObservedRunningTime="2026-03-16 00:37:11.491771344 +0000 UTC m=+1824.588071297" Mar 16 00:37:12 crc kubenswrapper[4816]: I0316 00:37:12.667571 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:37:12 crc kubenswrapper[4816]: E0316 00:37:12.667853 4816 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:37:23 crc kubenswrapper[4816]: I0316 00:37:23.667672 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:37:23 crc kubenswrapper[4816]: E0316 00:37:23.668475 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:37:32 crc kubenswrapper[4816]: I0316 00:37:32.479746 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-2n6hc_5be5ca83-5116-48dc-8d6c-733cbd3e9682/prometheus-webhook-snmp/0.log" Mar 16 00:37:35 crc kubenswrapper[4816]: I0316 00:37:35.667175 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:37:35 crc kubenswrapper[4816]: E0316 00:37:35.667671 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:37:37 crc kubenswrapper[4816]: I0316 00:37:37.683899 4816 generic.go:334] "Generic (PLEG): container finished" podID="bfb5a27e-f6df-44ba-8f68-446946410953" containerID="dff32730efec159cd6d8cf3ea205ebc1778cc70fb4d2a5f6c005a8a2de2b5ade" exitCode=0 Mar 16 00:37:37 crc kubenswrapper[4816]: I0316 00:37:37.688924 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-78t94" event={"ID":"bfb5a27e-f6df-44ba-8f68-446946410953","Type":"ContainerDied","Data":"dff32730efec159cd6d8cf3ea205ebc1778cc70fb4d2a5f6c005a8a2de2b5ade"} Mar 16 00:37:37 crc kubenswrapper[4816]: I0316 00:37:37.689908 4816 scope.go:117] "RemoveContainer" containerID="dff32730efec159cd6d8cf3ea205ebc1778cc70fb4d2a5f6c005a8a2de2b5ade" Mar 16 00:37:42 crc kubenswrapper[4816]: I0316 00:37:42.729349 4816 generic.go:334] "Generic (PLEG): container finished" podID="bfb5a27e-f6df-44ba-8f68-446946410953" containerID="694d068f0560de8c0ccec981067fb7a82ff8d6a56fdaddbf1e422c2f99d7ec45" exitCode=0 Mar 16 00:37:42 crc kubenswrapper[4816]: I0316 00:37:42.729400 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-78t94" event={"ID":"bfb5a27e-f6df-44ba-8f68-446946410953","Type":"ContainerDied","Data":"694d068f0560de8c0ccec981067fb7a82ff8d6a56fdaddbf1e422c2f99d7ec45"} Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.051597 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.154086 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-entrypoint-script\") pod \"bfb5a27e-f6df-44ba-8f68-446946410953\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.154153 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj6gg\" (UniqueName: \"kubernetes.io/projected/bfb5a27e-f6df-44ba-8f68-446946410953-kube-api-access-mj6gg\") pod \"bfb5a27e-f6df-44ba-8f68-446946410953\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.154186 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-publisher\") pod \"bfb5a27e-f6df-44ba-8f68-446946410953\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.154220 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-healthcheck-log\") pod \"bfb5a27e-f6df-44ba-8f68-446946410953\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.154245 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-sensubility-config\") pod \"bfb5a27e-f6df-44ba-8f68-446946410953\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.154287 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-entrypoint-script\") pod \"bfb5a27e-f6df-44ba-8f68-446946410953\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.154347 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-config\") pod \"bfb5a27e-f6df-44ba-8f68-446946410953\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.176218 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "bfb5a27e-f6df-44ba-8f68-446946410953" (UID: "bfb5a27e-f6df-44ba-8f68-446946410953"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.190732 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb5a27e-f6df-44ba-8f68-446946410953-kube-api-access-mj6gg" (OuterVolumeSpecName: "kube-api-access-mj6gg") pod "bfb5a27e-f6df-44ba-8f68-446946410953" (UID: "bfb5a27e-f6df-44ba-8f68-446946410953"). InnerVolumeSpecName "kube-api-access-mj6gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.210980 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "bfb5a27e-f6df-44ba-8f68-446946410953" (UID: "bfb5a27e-f6df-44ba-8f68-446946410953"). InnerVolumeSpecName "ceilometer-publisher". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.216065 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "bfb5a27e-f6df-44ba-8f68-446946410953" (UID: "bfb5a27e-f6df-44ba-8f68-446946410953"). InnerVolumeSpecName "sensubility-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.224482 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "bfb5a27e-f6df-44ba-8f68-446946410953" (UID: "bfb5a27e-f6df-44ba-8f68-446946410953"). InnerVolumeSpecName "healthcheck-log". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.235911 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "bfb5a27e-f6df-44ba-8f68-446946410953" (UID: "bfb5a27e-f6df-44ba-8f68-446946410953"). InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.247173 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "bfb5a27e-f6df-44ba-8f68-446946410953" (UID: "bfb5a27e-f6df-44ba-8f68-446946410953"). InnerVolumeSpecName "collectd-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.256269 4816 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.256309 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj6gg\" (UniqueName: \"kubernetes.io/projected/bfb5a27e-f6df-44ba-8f68-446946410953-kube-api-access-mj6gg\") on node \"crc\" DevicePath \"\"" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.256321 4816 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-publisher\") on node \"crc\" DevicePath \"\"" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.256336 4816 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-healthcheck-log\") on node \"crc\" DevicePath \"\"" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.256349 4816 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-sensubility-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.256359 4816 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.256371 4816 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-config\") on node 
\"crc\" DevicePath \"\"" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.746817 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-78t94" event={"ID":"bfb5a27e-f6df-44ba-8f68-446946410953","Type":"ContainerDied","Data":"b874d2ee7cf3967963c00611ffab38c70d37853ea80cc511f1984da79b61eef1"} Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.746863 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b874d2ee7cf3967963c00611ffab38c70d37853ea80cc511f1984da79b61eef1" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.746913 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:37:46 crc kubenswrapper[4816]: I0316 00:37:45.998521 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-78t94_bfb5a27e-f6df-44ba-8f68-446946410953/smoketest-collectd/0.log" Mar 16 00:37:46 crc kubenswrapper[4816]: I0316 00:37:46.303539 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-78t94_bfb5a27e-f6df-44ba-8f68-446946410953/smoketest-ceilometer/0.log" Mar 16 00:37:46 crc kubenswrapper[4816]: I0316 00:37:46.563110 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-pdx9k_99cc3f5f-4a76-4ef9-9001-dda6329b3fe0/default-interconnect/0.log" Mar 16 00:37:46 crc kubenswrapper[4816]: I0316 00:37:46.815425 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t_4de6f751-2471-4ce9-a771-00703e7be02a/bridge/2.log" Mar 16 00:37:47 crc kubenswrapper[4816]: I0316 00:37:47.067469 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t_4de6f751-2471-4ce9-a771-00703e7be02a/sg-core/0.log" Mar 16 00:37:47 
crc kubenswrapper[4816]: I0316 00:37:47.346340 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-598dc6844-7wkmc_71ca3af9-1a2f-4bd2-898a-13c9089b16c2/bridge/2.log" Mar 16 00:37:47 crc kubenswrapper[4816]: I0316 00:37:47.611326 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-598dc6844-7wkmc_71ca3af9-1a2f-4bd2-898a-13c9089b16c2/sg-core/0.log" Mar 16 00:37:47 crc kubenswrapper[4816]: I0316 00:37:47.891616 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v_a7c7b38e-dd7e-469c-ab38-173944ca2943/bridge/2.log" Mar 16 00:37:48 crc kubenswrapper[4816]: I0316 00:37:48.183089 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v_a7c7b38e-dd7e-469c-ab38-173944ca2943/sg-core/0.log" Mar 16 00:37:48 crc kubenswrapper[4816]: I0316 00:37:48.465610 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts_134459cc-413b-4996-a0c1-aafe8dae8ebb/bridge/2.log" Mar 16 00:37:48 crc kubenswrapper[4816]: I0316 00:37:48.770145 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts_134459cc-413b-4996-a0c1-aafe8dae8ebb/sg-core/0.log" Mar 16 00:37:49 crc kubenswrapper[4816]: I0316 00:37:49.006794 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-lr826_ee673348-980c-44f5-8e33-71a859ce740c/bridge/2.log" Mar 16 00:37:49 crc kubenswrapper[4816]: I0316 00:37:49.266201 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-lr826_ee673348-980c-44f5-8e33-71a859ce740c/sg-core/0.log" 
Mar 16 00:37:50 crc kubenswrapper[4816]: I0316 00:37:50.667506 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:37:50 crc kubenswrapper[4816]: E0316 00:37:50.667789 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:37:52 crc kubenswrapper[4816]: I0316 00:37:52.733153 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-55c8479bdf-6m44w_e96079bc-73ba-420e-9568-cea10077c4ae/operator/0.log" Mar 16 00:37:52 crc kubenswrapper[4816]: I0316 00:37:52.976944 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_078376fd-a0f8-4157-8a07-23ce85695dc6/prometheus/0.log" Mar 16 00:37:53 crc kubenswrapper[4816]: I0316 00:37:53.234257 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_819af9fc-6db9-4743-bd06-f844f5ef5b0d/elasticsearch/0.log" Mar 16 00:37:53 crc kubenswrapper[4816]: I0316 00:37:53.492338 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-2n6hc_5be5ca83-5116-48dc-8d6c-733cbd3e9682/prometheus-webhook-snmp/0.log" Mar 16 00:37:53 crc kubenswrapper[4816]: I0316 00:37:53.732772 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_f4698c34-c93e-4d6f-8ab8-2bfcf3118410/alertmanager/0.log" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.150130 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29560358-dc9b5"] Mar 16 00:38:00 crc kubenswrapper[4816]: E0316 00:38:00.151106 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb5a27e-f6df-44ba-8f68-446946410953" containerName="smoketest-ceilometer" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.151126 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb5a27e-f6df-44ba-8f68-446946410953" containerName="smoketest-ceilometer" Mar 16 00:38:00 crc kubenswrapper[4816]: E0316 00:38:00.151199 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5b0de9a-0a4c-468c-b49f-575ecbc053e3" containerName="curl" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.151209 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5b0de9a-0a4c-468c-b49f-575ecbc053e3" containerName="curl" Mar 16 00:38:00 crc kubenswrapper[4816]: E0316 00:38:00.151229 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb5a27e-f6df-44ba-8f68-446946410953" containerName="smoketest-collectd" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.151240 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb5a27e-f6df-44ba-8f68-446946410953" containerName="smoketest-collectd" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.151416 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb5a27e-f6df-44ba-8f68-446946410953" containerName="smoketest-ceilometer" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.151437 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb5a27e-f6df-44ba-8f68-446946410953" containerName="smoketest-collectd" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.151453 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5b0de9a-0a4c-468c-b49f-575ecbc053e3" containerName="curl" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.153172 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560358-dc9b5" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.156063 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.156103 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.158313 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560358-dc9b5"] Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.163114 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.186619 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvgsn\" (UniqueName: \"kubernetes.io/projected/f515fa39-31b2-47d2-b50d-d7b95a1cf7a1-kube-api-access-nvgsn\") pod \"auto-csr-approver-29560358-dc9b5\" (UID: \"f515fa39-31b2-47d2-b50d-d7b95a1cf7a1\") " pod="openshift-infra/auto-csr-approver-29560358-dc9b5" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.288825 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvgsn\" (UniqueName: \"kubernetes.io/projected/f515fa39-31b2-47d2-b50d-d7b95a1cf7a1-kube-api-access-nvgsn\") pod \"auto-csr-approver-29560358-dc9b5\" (UID: \"f515fa39-31b2-47d2-b50d-d7b95a1cf7a1\") " pod="openshift-infra/auto-csr-approver-29560358-dc9b5" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.307497 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvgsn\" (UniqueName: \"kubernetes.io/projected/f515fa39-31b2-47d2-b50d-d7b95a1cf7a1-kube-api-access-nvgsn\") pod \"auto-csr-approver-29560358-dc9b5\" (UID: \"f515fa39-31b2-47d2-b50d-d7b95a1cf7a1\") " 
pod="openshift-infra/auto-csr-approver-29560358-dc9b5" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.475795 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560358-dc9b5" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.911596 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560358-dc9b5"] Mar 16 00:38:01 crc kubenswrapper[4816]: I0316 00:38:01.904870 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560358-dc9b5" event={"ID":"f515fa39-31b2-47d2-b50d-d7b95a1cf7a1","Type":"ContainerStarted","Data":"cff1a249957940ab78d95b02deb4fde3f8fbd654950d4bd438fb1337cb77b306"} Mar 16 00:38:02 crc kubenswrapper[4816]: I0316 00:38:02.914830 4816 generic.go:334] "Generic (PLEG): container finished" podID="f515fa39-31b2-47d2-b50d-d7b95a1cf7a1" containerID="a277fc9eef3079702bbb7065b00f7756e2712612c3ceca27a724c44263f7e5ac" exitCode=0 Mar 16 00:38:02 crc kubenswrapper[4816]: I0316 00:38:02.914902 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560358-dc9b5" event={"ID":"f515fa39-31b2-47d2-b50d-d7b95a1cf7a1","Type":"ContainerDied","Data":"a277fc9eef3079702bbb7065b00f7756e2712612c3ceca27a724c44263f7e5ac"} Mar 16 00:38:04 crc kubenswrapper[4816]: I0316 00:38:04.169176 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560358-dc9b5" Mar 16 00:38:04 crc kubenswrapper[4816]: I0316 00:38:04.259910 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvgsn\" (UniqueName: \"kubernetes.io/projected/f515fa39-31b2-47d2-b50d-d7b95a1cf7a1-kube-api-access-nvgsn\") pod \"f515fa39-31b2-47d2-b50d-d7b95a1cf7a1\" (UID: \"f515fa39-31b2-47d2-b50d-d7b95a1cf7a1\") " Mar 16 00:38:04 crc kubenswrapper[4816]: I0316 00:38:04.276338 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f515fa39-31b2-47d2-b50d-d7b95a1cf7a1-kube-api-access-nvgsn" (OuterVolumeSpecName: "kube-api-access-nvgsn") pod "f515fa39-31b2-47d2-b50d-d7b95a1cf7a1" (UID: "f515fa39-31b2-47d2-b50d-d7b95a1cf7a1"). InnerVolumeSpecName "kube-api-access-nvgsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:38:04 crc kubenswrapper[4816]: I0316 00:38:04.361844 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvgsn\" (UniqueName: \"kubernetes.io/projected/f515fa39-31b2-47d2-b50d-d7b95a1cf7a1-kube-api-access-nvgsn\") on node \"crc\" DevicePath \"\"" Mar 16 00:38:04 crc kubenswrapper[4816]: I0316 00:38:04.667803 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:38:04 crc kubenswrapper[4816]: E0316 00:38:04.668189 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:38:04 crc kubenswrapper[4816]: I0316 00:38:04.932582 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29560358-dc9b5" event={"ID":"f515fa39-31b2-47d2-b50d-d7b95a1cf7a1","Type":"ContainerDied","Data":"cff1a249957940ab78d95b02deb4fde3f8fbd654950d4bd438fb1337cb77b306"} Mar 16 00:38:04 crc kubenswrapper[4816]: I0316 00:38:04.932618 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cff1a249957940ab78d95b02deb4fde3f8fbd654950d4bd438fb1337cb77b306" Mar 16 00:38:04 crc kubenswrapper[4816]: I0316 00:38:04.932619 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560358-dc9b5" Mar 16 00:38:05 crc kubenswrapper[4816]: I0316 00:38:05.231296 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560352-4c2cj"] Mar 16 00:38:05 crc kubenswrapper[4816]: I0316 00:38:05.237095 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560352-4c2cj"] Mar 16 00:38:05 crc kubenswrapper[4816]: I0316 00:38:05.677238 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cee5a2cc-4256-43fb-9517-83533a5acf29" path="/var/lib/kubelet/pods/cee5a2cc-4256-43fb-9517-83533a5acf29/volumes" Mar 16 00:38:09 crc kubenswrapper[4816]: I0316 00:38:09.687756 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-7dbcddcc6f-lqrgz_0faefde0-6740-414f-bf47-0d763a35b22f/operator/0.log" Mar 16 00:38:12 crc kubenswrapper[4816]: I0316 00:38:12.998664 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-55c8479bdf-6m44w_e96079bc-73ba-420e-9568-cea10077c4ae/operator/0.log" Mar 16 00:38:13 crc kubenswrapper[4816]: I0316 00:38:13.320307 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_e014dd01-f826-4642-bb43-dbdab4a1e503/qdr/0.log" Mar 16 00:38:15 crc kubenswrapper[4816]: I0316 00:38:15.668335 4816 scope.go:117] 
"RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:38:15 crc kubenswrapper[4816]: E0316 00:38:15.669138 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:38:29 crc kubenswrapper[4816]: I0316 00:38:29.668206 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:38:29 crc kubenswrapper[4816]: E0316 00:38:29.669441 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.287845 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-75vc8/must-gather-jdjt6"] Mar 16 00:38:39 crc kubenswrapper[4816]: E0316 00:38:39.288771 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f515fa39-31b2-47d2-b50d-d7b95a1cf7a1" containerName="oc" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.288791 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f515fa39-31b2-47d2-b50d-d7b95a1cf7a1" containerName="oc" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.288981 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f515fa39-31b2-47d2-b50d-d7b95a1cf7a1" containerName="oc" Mar 16 
00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.290065 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-75vc8/must-gather-jdjt6" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.292456 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-75vc8"/"default-dockercfg-8s7pv" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.292654 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-75vc8"/"kube-root-ca.crt" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.292757 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-75vc8"/"openshift-service-ca.crt" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.313865 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x46w\" (UniqueName: \"kubernetes.io/projected/09265502-9e41-4783-ad8d-206a0b0372c8-kube-api-access-4x46w\") pod \"must-gather-jdjt6\" (UID: \"09265502-9e41-4783-ad8d-206a0b0372c8\") " pod="openshift-must-gather-75vc8/must-gather-jdjt6" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.314210 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09265502-9e41-4783-ad8d-206a0b0372c8-must-gather-output\") pod \"must-gather-jdjt6\" (UID: \"09265502-9e41-4783-ad8d-206a0b0372c8\") " pod="openshift-must-gather-75vc8/must-gather-jdjt6" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.353089 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-75vc8/must-gather-jdjt6"] Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.415683 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/09265502-9e41-4783-ad8d-206a0b0372c8-must-gather-output\") pod \"must-gather-jdjt6\" (UID: \"09265502-9e41-4783-ad8d-206a0b0372c8\") " pod="openshift-must-gather-75vc8/must-gather-jdjt6" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.415772 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x46w\" (UniqueName: \"kubernetes.io/projected/09265502-9e41-4783-ad8d-206a0b0372c8-kube-api-access-4x46w\") pod \"must-gather-jdjt6\" (UID: \"09265502-9e41-4783-ad8d-206a0b0372c8\") " pod="openshift-must-gather-75vc8/must-gather-jdjt6" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.416246 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09265502-9e41-4783-ad8d-206a0b0372c8-must-gather-output\") pod \"must-gather-jdjt6\" (UID: \"09265502-9e41-4783-ad8d-206a0b0372c8\") " pod="openshift-must-gather-75vc8/must-gather-jdjt6" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.435660 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x46w\" (UniqueName: \"kubernetes.io/projected/09265502-9e41-4783-ad8d-206a0b0372c8-kube-api-access-4x46w\") pod \"must-gather-jdjt6\" (UID: \"09265502-9e41-4783-ad8d-206a0b0372c8\") " pod="openshift-must-gather-75vc8/must-gather-jdjt6" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.606543 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-75vc8/must-gather-jdjt6" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.860206 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-75vc8/must-gather-jdjt6"] Mar 16 00:38:40 crc kubenswrapper[4816]: I0316 00:38:40.251676 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-75vc8/must-gather-jdjt6" event={"ID":"09265502-9e41-4783-ad8d-206a0b0372c8","Type":"ContainerStarted","Data":"dea01f9b295d86a7b4b3bc5abc7b94cb38d7e5aae51dde99d686b6fd876042de"} Mar 16 00:38:43 crc kubenswrapper[4816]: I0316 00:38:43.667763 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:38:43 crc kubenswrapper[4816]: E0316 00:38:43.668209 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:38:46 crc kubenswrapper[4816]: I0316 00:38:46.303254 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-75vc8/must-gather-jdjt6" event={"ID":"09265502-9e41-4783-ad8d-206a0b0372c8","Type":"ContainerStarted","Data":"494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa"} Mar 16 00:38:47 crc kubenswrapper[4816]: I0316 00:38:47.314447 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-75vc8/must-gather-jdjt6" event={"ID":"09265502-9e41-4783-ad8d-206a0b0372c8","Type":"ContainerStarted","Data":"8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a"} Mar 16 00:38:47 crc kubenswrapper[4816]: I0316 00:38:47.340071 4816 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-75vc8/must-gather-jdjt6" podStartSLOduration=2.183096339 podStartE2EDuration="8.340051869s" podCreationTimestamp="2026-03-16 00:38:39 +0000 UTC" firstStartedPulling="2026-03-16 00:38:39.869782277 +0000 UTC m=+1912.966082230" lastFinishedPulling="2026-03-16 00:38:46.026737807 +0000 UTC m=+1919.123037760" observedRunningTime="2026-03-16 00:38:47.33409119 +0000 UTC m=+1920.430391183" watchObservedRunningTime="2026-03-16 00:38:47.340051869 +0000 UTC m=+1920.436351822" Mar 16 00:38:56 crc kubenswrapper[4816]: I0316 00:38:56.667845 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:38:56 crc kubenswrapper[4816]: E0316 00:38:56.668695 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:39:05 crc kubenswrapper[4816]: I0316 00:39:05.498047 4816 scope.go:117] "RemoveContainer" containerID="e30d5325e2ed5b1448d06f7c2b9b149b118aecd27db13855743e289187e36f13" Mar 16 00:39:05 crc kubenswrapper[4816]: I0316 00:39:05.542131 4816 scope.go:117] "RemoveContainer" containerID="d67550a0f7d36330df42139700602ca21750bc1c1f58bc1cc3a210d0f86409d3" Mar 16 00:39:05 crc kubenswrapper[4816]: I0316 00:39:05.582594 4816 scope.go:117] "RemoveContainer" containerID="e4439cd35f13a68a04a6b45eaa00f3aeec10afe4bbea233d056d60648e32f1e4" Mar 16 00:39:05 crc kubenswrapper[4816]: I0316 00:39:05.602856 4816 scope.go:117] "RemoveContainer" containerID="520f1908678615d0e5b73ebdbbe6a48ebe9c84b2afda6fc6f03c6d31b9a2fb39" Mar 16 00:39:05 crc kubenswrapper[4816]: I0316 00:39:05.630215 4816 scope.go:117] 
"RemoveContainer" containerID="2169e8fca36c31b741a4793cc4a50c325f1ec3d6141a69fbe357f1c522080d5b" Mar 16 00:39:08 crc kubenswrapper[4816]: I0316 00:39:08.668524 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:39:08 crc kubenswrapper[4816]: E0316 00:39:08.669127 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:39:21 crc kubenswrapper[4816]: I0316 00:39:21.668319 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:39:21 crc kubenswrapper[4816]: E0316 00:39:21.669105 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:39:24 crc kubenswrapper[4816]: I0316 00:39:24.475451 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-jg5tv"] Mar 16 00:39:24 crc kubenswrapper[4816]: I0316 00:39:24.476436 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:24 crc kubenswrapper[4816]: I0316 00:39:24.491242 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-jg5tv"] Mar 16 00:39:24 crc kubenswrapper[4816]: I0316 00:39:24.517541 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2j9c\" (UniqueName: \"kubernetes.io/projected/35c26e16-0aa9-4252-995c-d34bd14ed1a9-kube-api-access-x2j9c\") pod \"infrawatch-operators-jg5tv\" (UID: \"35c26e16-0aa9-4252-995c-d34bd14ed1a9\") " pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:24 crc kubenswrapper[4816]: I0316 00:39:24.618701 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2j9c\" (UniqueName: \"kubernetes.io/projected/35c26e16-0aa9-4252-995c-d34bd14ed1a9-kube-api-access-x2j9c\") pod \"infrawatch-operators-jg5tv\" (UID: \"35c26e16-0aa9-4252-995c-d34bd14ed1a9\") " pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:24 crc kubenswrapper[4816]: I0316 00:39:24.656780 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2j9c\" (UniqueName: \"kubernetes.io/projected/35c26e16-0aa9-4252-995c-d34bd14ed1a9-kube-api-access-x2j9c\") pod \"infrawatch-operators-jg5tv\" (UID: \"35c26e16-0aa9-4252-995c-d34bd14ed1a9\") " pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:24 crc kubenswrapper[4816]: I0316 00:39:24.793062 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:25 crc kubenswrapper[4816]: I0316 00:39:25.037982 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-jg5tv"] Mar 16 00:39:25 crc kubenswrapper[4816]: I0316 00:39:25.044241 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:39:25 crc kubenswrapper[4816]: I0316 00:39:25.621301 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-jg5tv" event={"ID":"35c26e16-0aa9-4252-995c-d34bd14ed1a9","Type":"ContainerStarted","Data":"7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd"} Mar 16 00:39:25 crc kubenswrapper[4816]: I0316 00:39:25.621620 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-jg5tv" event={"ID":"35c26e16-0aa9-4252-995c-d34bd14ed1a9","Type":"ContainerStarted","Data":"5cfc0be3c97d0c2aeedde44df26a4971ceee6aafb9c2f7efbcbc89b8db40d266"} Mar 16 00:39:25 crc kubenswrapper[4816]: I0316 00:39:25.640327 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-jg5tv" podStartSLOduration=1.528942855 podStartE2EDuration="1.640310614s" podCreationTimestamp="2026-03-16 00:39:24 +0000 UTC" firstStartedPulling="2026-03-16 00:39:25.043943713 +0000 UTC m=+1958.140243676" lastFinishedPulling="2026-03-16 00:39:25.155311482 +0000 UTC m=+1958.251611435" observedRunningTime="2026-03-16 00:39:25.635418986 +0000 UTC m=+1958.731718949" watchObservedRunningTime="2026-03-16 00:39:25.640310614 +0000 UTC m=+1958.736610567" Mar 16 00:39:30 crc kubenswrapper[4816]: I0316 00:39:30.506914 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-mwhpz_dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a/control-plane-machine-set-operator/0.log" Mar 16 00:39:30 crc 
kubenswrapper[4816]: I0316 00:39:30.644764 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9xv4p_1306b657-0022-435d-bb72-793f1c1a106b/kube-rbac-proxy/0.log" Mar 16 00:39:30 crc kubenswrapper[4816]: I0316 00:39:30.680501 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9xv4p_1306b657-0022-435d-bb72-793f1c1a106b/machine-api-operator/0.log" Mar 16 00:39:33 crc kubenswrapper[4816]: I0316 00:39:33.668256 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:39:33 crc kubenswrapper[4816]: E0316 00:39:33.668975 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:39:34 crc kubenswrapper[4816]: I0316 00:39:34.795229 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:34 crc kubenswrapper[4816]: I0316 00:39:34.795276 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:34 crc kubenswrapper[4816]: I0316 00:39:34.838639 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:35 crc kubenswrapper[4816]: I0316 00:39:35.730413 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:35 crc kubenswrapper[4816]: I0316 00:39:35.785687 4816 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["service-telemetry/infrawatch-operators-jg5tv"] Mar 16 00:39:37 crc kubenswrapper[4816]: I0316 00:39:37.713959 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-jg5tv" podUID="35c26e16-0aa9-4252-995c-d34bd14ed1a9" containerName="registry-server" containerID="cri-o://7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd" gracePeriod=2 Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.164502 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.225508 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2j9c\" (UniqueName: \"kubernetes.io/projected/35c26e16-0aa9-4252-995c-d34bd14ed1a9-kube-api-access-x2j9c\") pod \"35c26e16-0aa9-4252-995c-d34bd14ed1a9\" (UID: \"35c26e16-0aa9-4252-995c-d34bd14ed1a9\") " Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.234495 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c26e16-0aa9-4252-995c-d34bd14ed1a9-kube-api-access-x2j9c" (OuterVolumeSpecName: "kube-api-access-x2j9c") pod "35c26e16-0aa9-4252-995c-d34bd14ed1a9" (UID: "35c26e16-0aa9-4252-995c-d34bd14ed1a9"). InnerVolumeSpecName "kube-api-access-x2j9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.327281 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2j9c\" (UniqueName: \"kubernetes.io/projected/35c26e16-0aa9-4252-995c-d34bd14ed1a9-kube-api-access-x2j9c\") on node \"crc\" DevicePath \"\"" Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.722927 4816 generic.go:334] "Generic (PLEG): container finished" podID="35c26e16-0aa9-4252-995c-d34bd14ed1a9" containerID="7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd" exitCode=0 Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.722967 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-jg5tv" event={"ID":"35c26e16-0aa9-4252-995c-d34bd14ed1a9","Type":"ContainerDied","Data":"7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd"} Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.722996 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-jg5tv" event={"ID":"35c26e16-0aa9-4252-995c-d34bd14ed1a9","Type":"ContainerDied","Data":"5cfc0be3c97d0c2aeedde44df26a4971ceee6aafb9c2f7efbcbc89b8db40d266"} Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.723016 4816 scope.go:117] "RemoveContainer" containerID="7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd" Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.723028 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.748201 4816 scope.go:117] "RemoveContainer" containerID="7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd" Mar 16 00:39:38 crc kubenswrapper[4816]: E0316 00:39:38.748748 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd\": container with ID starting with 7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd not found: ID does not exist" containerID="7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd" Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.748784 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd"} err="failed to get container status \"7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd\": rpc error: code = NotFound desc = could not find container \"7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd\": container with ID starting with 7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd not found: ID does not exist" Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.758718 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-jg5tv"] Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.767852 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-jg5tv"] Mar 16 00:39:39 crc kubenswrapper[4816]: I0316 00:39:39.676869 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35c26e16-0aa9-4252-995c-d34bd14ed1a9" path="/var/lib/kubelet/pods/35c26e16-0aa9-4252-995c-d34bd14ed1a9/volumes" Mar 16 00:39:44 crc kubenswrapper[4816]: I0316 00:39:44.145516 4816 log.go:25] "Finished parsing 
log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-9q9nz_88d51e1b-a795-4157-82b4-8a74d228e698/cert-manager-controller/0.log" Mar 16 00:39:44 crc kubenswrapper[4816]: I0316 00:39:44.294993 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-25jvg_fe81d263-aafd-4bdb-a088-d4bc52592a2d/cert-manager-cainjector/0.log" Mar 16 00:39:44 crc kubenswrapper[4816]: I0316 00:39:44.379049 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-ssr4q_ca67da37-05ff-4b13-aeea-04ac7f17ffc0/cert-manager-webhook/0.log" Mar 16 00:39:44 crc kubenswrapper[4816]: I0316 00:39:44.667977 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:39:44 crc kubenswrapper[4816]: E0316 00:39:44.668237 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:39:56 crc kubenswrapper[4816]: I0316 00:39:56.667627 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:39:56 crc kubenswrapper[4816]: E0316 00:39:56.668583 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:39:59 crc 
kubenswrapper[4816]: I0316 00:39:59.697198 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-tfv44_562f24fe-5c4c-4540-96ae-6e01f539141b/prometheus-operator/0.log" Mar 16 00:39:59 crc kubenswrapper[4816]: I0316 00:39:59.866625 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk_9a808114-3164-4abe-a481-1b5d3b9df2a0/prometheus-operator-admission-webhook/0.log" Mar 16 00:39:59 crc kubenswrapper[4816]: I0316 00:39:59.917104 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn_36951342-3370-4291-baa3-2612f64036fd/prometheus-operator-admission-webhook/0.log" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.052990 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-w6wv7_8d0f60fa-8d26-43ea-a680-1d3a92dd270d/operator/0.log" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.086238 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-t7w7m_f24959c1-f57f-4bf6-8a55-c8a35173ff8b/perses-operator/0.log" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.133656 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560360-m8l8g"] Mar 16 00:40:00 crc kubenswrapper[4816]: E0316 00:40:00.133892 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c26e16-0aa9-4252-995c-d34bd14ed1a9" containerName="registry-server" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.133904 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c26e16-0aa9-4252-995c-d34bd14ed1a9" containerName="registry-server" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.134022 4816 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="35c26e16-0aa9-4252-995c-d34bd14ed1a9" containerName="registry-server" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.134423 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560360-m8l8g" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.139890 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.140022 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.140087 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.147147 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560360-m8l8g"] Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.281750 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phg76\" (UniqueName: \"kubernetes.io/projected/e737de26-7c67-4557-9e7b-67fd7e9d835b-kube-api-access-phg76\") pod \"auto-csr-approver-29560360-m8l8g\" (UID: \"e737de26-7c67-4557-9e7b-67fd7e9d835b\") " pod="openshift-infra/auto-csr-approver-29560360-m8l8g" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.383677 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phg76\" (UniqueName: \"kubernetes.io/projected/e737de26-7c67-4557-9e7b-67fd7e9d835b-kube-api-access-phg76\") pod \"auto-csr-approver-29560360-m8l8g\" (UID: \"e737de26-7c67-4557-9e7b-67fd7e9d835b\") " pod="openshift-infra/auto-csr-approver-29560360-m8l8g" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.405533 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phg76\" 
(UniqueName: \"kubernetes.io/projected/e737de26-7c67-4557-9e7b-67fd7e9d835b-kube-api-access-phg76\") pod \"auto-csr-approver-29560360-m8l8g\" (UID: \"e737de26-7c67-4557-9e7b-67fd7e9d835b\") " pod="openshift-infra/auto-csr-approver-29560360-m8l8g" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.481703 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560360-m8l8g" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.896516 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560360-m8l8g"] Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.907998 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560360-m8l8g" event={"ID":"e737de26-7c67-4557-9e7b-67fd7e9d835b","Type":"ContainerStarted","Data":"b81fe24230d2398c34926e1bf3e9257add7500c9b552f10b835db83e285f4bdc"} Mar 16 00:40:02 crc kubenswrapper[4816]: I0316 00:40:02.930165 4816 generic.go:334] "Generic (PLEG): container finished" podID="e737de26-7c67-4557-9e7b-67fd7e9d835b" containerID="965ec4e3b522a6a1ef0e1631c35ba6e14d08dbcd684be23e71cc56af8a75f39e" exitCode=0 Mar 16 00:40:02 crc kubenswrapper[4816]: I0316 00:40:02.930451 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560360-m8l8g" event={"ID":"e737de26-7c67-4557-9e7b-67fd7e9d835b","Type":"ContainerDied","Data":"965ec4e3b522a6a1ef0e1631c35ba6e14d08dbcd684be23e71cc56af8a75f39e"} Mar 16 00:40:04 crc kubenswrapper[4816]: I0316 00:40:04.188034 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560360-m8l8g" Mar 16 00:40:04 crc kubenswrapper[4816]: I0316 00:40:04.236837 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phg76\" (UniqueName: \"kubernetes.io/projected/e737de26-7c67-4557-9e7b-67fd7e9d835b-kube-api-access-phg76\") pod \"e737de26-7c67-4557-9e7b-67fd7e9d835b\" (UID: \"e737de26-7c67-4557-9e7b-67fd7e9d835b\") " Mar 16 00:40:04 crc kubenswrapper[4816]: I0316 00:40:04.242320 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e737de26-7c67-4557-9e7b-67fd7e9d835b-kube-api-access-phg76" (OuterVolumeSpecName: "kube-api-access-phg76") pod "e737de26-7c67-4557-9e7b-67fd7e9d835b" (UID: "e737de26-7c67-4557-9e7b-67fd7e9d835b"). InnerVolumeSpecName "kube-api-access-phg76". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:40:04 crc kubenswrapper[4816]: I0316 00:40:04.338921 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phg76\" (UniqueName: \"kubernetes.io/projected/e737de26-7c67-4557-9e7b-67fd7e9d835b-kube-api-access-phg76\") on node \"crc\" DevicePath \"\"" Mar 16 00:40:04 crc kubenswrapper[4816]: I0316 00:40:04.943917 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560360-m8l8g" event={"ID":"e737de26-7c67-4557-9e7b-67fd7e9d835b","Type":"ContainerDied","Data":"b81fe24230d2398c34926e1bf3e9257add7500c9b552f10b835db83e285f4bdc"} Mar 16 00:40:04 crc kubenswrapper[4816]: I0316 00:40:04.943962 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b81fe24230d2398c34926e1bf3e9257add7500c9b552f10b835db83e285f4bdc" Mar 16 00:40:04 crc kubenswrapper[4816]: I0316 00:40:04.944025 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560360-m8l8g" Mar 16 00:40:05 crc kubenswrapper[4816]: I0316 00:40:05.250245 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560354-nmflm"] Mar 16 00:40:05 crc kubenswrapper[4816]: I0316 00:40:05.256660 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560354-nmflm"] Mar 16 00:40:05 crc kubenswrapper[4816]: I0316 00:40:05.684340 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4786aa78-4870-43d7-a324-e3e3dd2c7943" path="/var/lib/kubelet/pods/4786aa78-4870-43d7-a324-e3e3dd2c7943/volumes" Mar 16 00:40:07 crc kubenswrapper[4816]: I0316 00:40:07.671947 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:40:07 crc kubenswrapper[4816]: E0316 00:40:07.672196 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:40:12 crc kubenswrapper[4816]: I0316 00:40:12.879479 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vn8mx"] Mar 16 00:40:12 crc kubenswrapper[4816]: E0316 00:40:12.879962 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e737de26-7c67-4557-9e7b-67fd7e9d835b" containerName="oc" Mar 16 00:40:12 crc kubenswrapper[4816]: I0316 00:40:12.879973 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e737de26-7c67-4557-9e7b-67fd7e9d835b" containerName="oc" Mar 16 00:40:12 crc kubenswrapper[4816]: I0316 00:40:12.880098 4816 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="e737de26-7c67-4557-9e7b-67fd7e9d835b" containerName="oc" Mar 16 00:40:12 crc kubenswrapper[4816]: I0316 00:40:12.880908 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:12 crc kubenswrapper[4816]: I0316 00:40:12.901908 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vn8mx"] Mar 16 00:40:12 crc kubenswrapper[4816]: I0316 00:40:12.964178 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-utilities\") pod \"community-operators-vn8mx\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:12 crc kubenswrapper[4816]: I0316 00:40:12.964250 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sndqw\" (UniqueName: \"kubernetes.io/projected/4f54343a-e5a5-4a0f-9940-5dabdbea5927-kube-api-access-sndqw\") pod \"community-operators-vn8mx\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:12 crc kubenswrapper[4816]: I0316 00:40:12.964300 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-catalog-content\") pod \"community-operators-vn8mx\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:13 crc kubenswrapper[4816]: I0316 00:40:13.065267 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-utilities\") pod \"community-operators-vn8mx\" (UID: 
\"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:13 crc kubenswrapper[4816]: I0316 00:40:13.065519 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sndqw\" (UniqueName: \"kubernetes.io/projected/4f54343a-e5a5-4a0f-9940-5dabdbea5927-kube-api-access-sndqw\") pod \"community-operators-vn8mx\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:13 crc kubenswrapper[4816]: I0316 00:40:13.065647 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-catalog-content\") pod \"community-operators-vn8mx\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:13 crc kubenswrapper[4816]: I0316 00:40:13.066156 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-catalog-content\") pod \"community-operators-vn8mx\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:13 crc kubenswrapper[4816]: I0316 00:40:13.066426 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-utilities\") pod \"community-operators-vn8mx\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:13 crc kubenswrapper[4816]: I0316 00:40:13.085236 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sndqw\" (UniqueName: \"kubernetes.io/projected/4f54343a-e5a5-4a0f-9940-5dabdbea5927-kube-api-access-sndqw\") pod \"community-operators-vn8mx\" (UID: 
\"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:13 crc kubenswrapper[4816]: I0316 00:40:13.255366 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:14 crc kubenswrapper[4816]: I0316 00:40:14.585607 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vn8mx"] Mar 16 00:40:14 crc kubenswrapper[4816]: I0316 00:40:14.880726 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rr89x"] Mar 16 00:40:14 crc kubenswrapper[4816]: I0316 00:40:14.882598 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:14 crc kubenswrapper[4816]: I0316 00:40:14.897367 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rr89x"] Mar 16 00:40:14 crc kubenswrapper[4816]: I0316 00:40:14.993406 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c859eca9-ca62-4c8a-a11d-dac23fafcec5-catalog-content\") pod \"certified-operators-rr89x\" (UID: \"c859eca9-ca62-4c8a-a11d-dac23fafcec5\") " pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:14 crc kubenswrapper[4816]: I0316 00:40:14.993453 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c859eca9-ca62-4c8a-a11d-dac23fafcec5-utilities\") pod \"certified-operators-rr89x\" (UID: \"c859eca9-ca62-4c8a-a11d-dac23fafcec5\") " pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:14 crc kubenswrapper[4816]: I0316 00:40:14.993474 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9d67l\" (UniqueName: \"kubernetes.io/projected/c859eca9-ca62-4c8a-a11d-dac23fafcec5-kube-api-access-9d67l\") pod \"certified-operators-rr89x\" (UID: \"c859eca9-ca62-4c8a-a11d-dac23fafcec5\") " pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.059385 4816 generic.go:334] "Generic (PLEG): container finished" podID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerID="0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67" exitCode=0 Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.059484 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn8mx" event={"ID":"4f54343a-e5a5-4a0f-9940-5dabdbea5927","Type":"ContainerDied","Data":"0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67"} Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.059750 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn8mx" event={"ID":"4f54343a-e5a5-4a0f-9940-5dabdbea5927","Type":"ContainerStarted","Data":"7ee26dd0ee79cd708284dffb1050308eefe558df23e3cb8fc337e2dbccfbdb55"} Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.095098 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c859eca9-ca62-4c8a-a11d-dac23fafcec5-catalog-content\") pod \"certified-operators-rr89x\" (UID: \"c859eca9-ca62-4c8a-a11d-dac23fafcec5\") " pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.095150 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c859eca9-ca62-4c8a-a11d-dac23fafcec5-utilities\") pod \"certified-operators-rr89x\" (UID: \"c859eca9-ca62-4c8a-a11d-dac23fafcec5\") " pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 
00:40:15.095187 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d67l\" (UniqueName: \"kubernetes.io/projected/c859eca9-ca62-4c8a-a11d-dac23fafcec5-kube-api-access-9d67l\") pod \"certified-operators-rr89x\" (UID: \"c859eca9-ca62-4c8a-a11d-dac23fafcec5\") " pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.095904 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c859eca9-ca62-4c8a-a11d-dac23fafcec5-utilities\") pod \"certified-operators-rr89x\" (UID: \"c859eca9-ca62-4c8a-a11d-dac23fafcec5\") " pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.096241 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c859eca9-ca62-4c8a-a11d-dac23fafcec5-catalog-content\") pod \"certified-operators-rr89x\" (UID: \"c859eca9-ca62-4c8a-a11d-dac23fafcec5\") " pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.118865 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d67l\" (UniqueName: \"kubernetes.io/projected/c859eca9-ca62-4c8a-a11d-dac23fafcec5-kube-api-access-9d67l\") pod \"certified-operators-rr89x\" (UID: \"c859eca9-ca62-4c8a-a11d-dac23fafcec5\") " pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.205488 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.762550 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rr89x"] Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.036817 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9_43895212-4bba-4d69-b3eb-10f49e771de3/util/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.067430 4816 generic.go:334] "Generic (PLEG): container finished" podID="c859eca9-ca62-4c8a-a11d-dac23fafcec5" containerID="8f93df4c6d7e443e6258a1faadd00d501b3f70b30052f1330cfb00dfaabbe7d6" exitCode=0 Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.067490 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rr89x" event={"ID":"c859eca9-ca62-4c8a-a11d-dac23fafcec5","Type":"ContainerDied","Data":"8f93df4c6d7e443e6258a1faadd00d501b3f70b30052f1330cfb00dfaabbe7d6"} Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.067517 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rr89x" event={"ID":"c859eca9-ca62-4c8a-a11d-dac23fafcec5","Type":"ContainerStarted","Data":"725b71ee7bad24f50af2dd8bfdd56c09f41a8531b097fc153f7ad1c3c22a44fd"} Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.069542 4816 generic.go:334] "Generic (PLEG): container finished" podID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerID="574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb" exitCode=0 Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.069593 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn8mx" event={"ID":"4f54343a-e5a5-4a0f-9940-5dabdbea5927","Type":"ContainerDied","Data":"574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb"} Mar 16 
00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.197621 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9_43895212-4bba-4d69-b3eb-10f49e771de3/pull/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.227696 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9_43895212-4bba-4d69-b3eb-10f49e771de3/util/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.239459 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9_43895212-4bba-4d69-b3eb-10f49e771de3/pull/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.383094 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9_43895212-4bba-4d69-b3eb-10f49e771de3/pull/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.386610 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9_43895212-4bba-4d69-b3eb-10f49e771de3/util/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.401380 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9_43895212-4bba-4d69-b3eb-10f49e771de3/extract/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.595095 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z_3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c/util/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.751233 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z_3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c/pull/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.764618 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z_3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c/util/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.783101 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z_3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c/pull/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.012109 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z_3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c/pull/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.028440 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z_3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c/extract/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.048118 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z_3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c/util/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.079059 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn8mx" event={"ID":"4f54343a-e5a5-4a0f-9940-5dabdbea5927","Type":"ContainerStarted","Data":"5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795"} Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.097401 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vn8mx" 
podStartSLOduration=3.623918883 podStartE2EDuration="5.097386362s" podCreationTimestamp="2026-03-16 00:40:12 +0000 UTC" firstStartedPulling="2026-03-16 00:40:15.061109 +0000 UTC m=+2008.157408953" lastFinishedPulling="2026-03-16 00:40:16.534576469 +0000 UTC m=+2009.630876432" observedRunningTime="2026-03-16 00:40:17.095313573 +0000 UTC m=+2010.191613526" watchObservedRunningTime="2026-03-16 00:40:17.097386362 +0000 UTC m=+2010.193686315" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.238104 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l_1da45fda-a8cc-46c1-8831-58418ecc9819/util/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.396691 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l_1da45fda-a8cc-46c1-8831-58418ecc9819/util/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.423579 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l_1da45fda-a8cc-46c1-8831-58418ecc9819/pull/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.431157 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l_1da45fda-a8cc-46c1-8831-58418ecc9819/pull/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.586286 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l_1da45fda-a8cc-46c1-8831-58418ecc9819/util/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.614707 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l_1da45fda-a8cc-46c1-8831-58418ecc9819/extract/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.653193 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l_1da45fda-a8cc-46c1-8831-58418ecc9819/pull/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.795898 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4_35d36436-ca87-48ef-9a68-484c2335bb33/util/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.928178 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4_35d36436-ca87-48ef-9a68-484c2335bb33/util/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.990652 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4_35d36436-ca87-48ef-9a68-484c2335bb33/pull/0.log" Mar 16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.054577 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4_35d36436-ca87-48ef-9a68-484c2335bb33/pull/0.log" Mar 16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.196499 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4_35d36436-ca87-48ef-9a68-484c2335bb33/extract/0.log" Mar 16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.248737 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4_35d36436-ca87-48ef-9a68-484c2335bb33/util/0.log" Mar 
16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.250611 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4_35d36436-ca87-48ef-9a68-484c2335bb33/pull/0.log" Mar 16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.387914 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6z2gx_9d1b1f79-de52-4ade-9a72-69b86c55e8ff/extract-utilities/0.log" Mar 16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.624982 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6z2gx_9d1b1f79-de52-4ade-9a72-69b86c55e8ff/extract-content/0.log" Mar 16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.632108 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6z2gx_9d1b1f79-de52-4ade-9a72-69b86c55e8ff/extract-content/0.log" Mar 16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.649317 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6z2gx_9d1b1f79-de52-4ade-9a72-69b86c55e8ff/extract-utilities/0.log" Mar 16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.829662 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6z2gx_9d1b1f79-de52-4ade-9a72-69b86c55e8ff/extract-utilities/0.log" Mar 16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.866064 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6z2gx_9d1b1f79-de52-4ade-9a72-69b86c55e8ff/extract-content/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.017958 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rr89x_c859eca9-ca62-4c8a-a11d-dac23fafcec5/extract-utilities/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.149080 4816 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6z2gx_9d1b1f79-de52-4ade-9a72-69b86c55e8ff/registry-server/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.199974 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rr89x_c859eca9-ca62-4c8a-a11d-dac23fafcec5/extract-utilities/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.399543 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rr89x_c859eca9-ca62-4c8a-a11d-dac23fafcec5/extract-utilities/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.505173 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8jcgw_249ae30f-a698-43f3-9464-24868dff2ad6/extract-utilities/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.642049 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8jcgw_249ae30f-a698-43f3-9464-24868dff2ad6/extract-content/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.646487 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8jcgw_249ae30f-a698-43f3-9464-24868dff2ad6/extract-utilities/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.671158 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8jcgw_249ae30f-a698-43f3-9464-24868dff2ad6/extract-content/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.822609 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8jcgw_249ae30f-a698-43f3-9464-24868dff2ad6/extract-content/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.900085 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-vn8mx_4f54343a-e5a5-4a0f-9940-5dabdbea5927/extract-utilities/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.920917 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8jcgw_249ae30f-a698-43f3-9464-24868dff2ad6/extract-utilities/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.123038 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vn8mx_4f54343a-e5a5-4a0f-9940-5dabdbea5927/extract-utilities/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.149898 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vn8mx_4f54343a-e5a5-4a0f-9940-5dabdbea5927/extract-content/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.162072 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vn8mx_4f54343a-e5a5-4a0f-9940-5dabdbea5927/extract-content/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.338412 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vn8mx_4f54343a-e5a5-4a0f-9940-5dabdbea5927/registry-server/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.357848 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vn8mx_4f54343a-e5a5-4a0f-9940-5dabdbea5927/extract-content/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.394238 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8jcgw_249ae30f-a698-43f3-9464-24868dff2ad6/registry-server/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.407181 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-vn8mx_4f54343a-e5a5-4a0f-9940-5dabdbea5927/extract-utilities/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.552725 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmzx7_6df1dc3a-6abd-4ffc-b27b-e66f281ed273/extract-utilities/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.570191 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8ln7g_6d197f63-0b7c-496d-89bb-9cd70933969a/marketplace-operator/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.736597 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmzx7_6df1dc3a-6abd-4ffc-b27b-e66f281ed273/extract-content/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.736850 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmzx7_6df1dc3a-6abd-4ffc-b27b-e66f281ed273/extract-utilities/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.757202 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmzx7_6df1dc3a-6abd-4ffc-b27b-e66f281ed273/extract-content/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.917981 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmzx7_6df1dc3a-6abd-4ffc-b27b-e66f281ed273/extract-utilities/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.939068 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmzx7_6df1dc3a-6abd-4ffc-b27b-e66f281ed273/extract-content/0.log" Mar 16 00:40:21 crc kubenswrapper[4816]: I0316 00:40:21.213110 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-hmzx7_6df1dc3a-6abd-4ffc-b27b-e66f281ed273/registry-server/0.log" Mar 16 00:40:21 crc kubenswrapper[4816]: I0316 00:40:21.667620 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:40:21 crc kubenswrapper[4816]: E0316 00:40:21.667854 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:40:23 crc kubenswrapper[4816]: I0316 00:40:23.127696 4816 generic.go:334] "Generic (PLEG): container finished" podID="c859eca9-ca62-4c8a-a11d-dac23fafcec5" containerID="5f2c0db44c076ad7e9a7e236b9040381aff8d488da201ce9ed1de55f218946d9" exitCode=0 Mar 16 00:40:23 crc kubenswrapper[4816]: I0316 00:40:23.127835 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rr89x" event={"ID":"c859eca9-ca62-4c8a-a11d-dac23fafcec5","Type":"ContainerDied","Data":"5f2c0db44c076ad7e9a7e236b9040381aff8d488da201ce9ed1de55f218946d9"} Mar 16 00:40:23 crc kubenswrapper[4816]: I0316 00:40:23.255776 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:23 crc kubenswrapper[4816]: I0316 00:40:23.255865 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:23 crc kubenswrapper[4816]: I0316 00:40:23.322700 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:24 crc kubenswrapper[4816]: I0316 
00:40:24.138152 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rr89x" event={"ID":"c859eca9-ca62-4c8a-a11d-dac23fafcec5","Type":"ContainerStarted","Data":"6d6395fd6fd2c13b0749094b0edc67b382632e3df50267fec6f9cc1310773f3c"} Mar 16 00:40:24 crc kubenswrapper[4816]: I0316 00:40:24.167145 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rr89x" podStartSLOduration=2.720901708 podStartE2EDuration="10.167120569s" podCreationTimestamp="2026-03-16 00:40:14 +0000 UTC" firstStartedPulling="2026-03-16 00:40:16.06950614 +0000 UTC m=+2009.165806093" lastFinishedPulling="2026-03-16 00:40:23.515724961 +0000 UTC m=+2016.612024954" observedRunningTime="2026-03-16 00:40:24.159098262 +0000 UTC m=+2017.255398225" watchObservedRunningTime="2026-03-16 00:40:24.167120569 +0000 UTC m=+2017.263420522" Mar 16 00:40:24 crc kubenswrapper[4816]: I0316 00:40:24.202454 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:25 crc kubenswrapper[4816]: I0316 00:40:25.205937 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:25 crc kubenswrapper[4816]: I0316 00:40:25.206180 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:25 crc kubenswrapper[4816]: I0316 00:40:25.373464 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vn8mx"] Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.155797 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vn8mx" podUID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerName="registry-server" 
containerID="cri-o://5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795" gracePeriod=2
Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.260029 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rr89x" podUID="c859eca9-ca62-4c8a-a11d-dac23fafcec5" containerName="registry-server" probeResult="failure" output=<
Mar 16 00:40:26 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s
Mar 16 00:40:26 crc kubenswrapper[4816]: >
Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.526327 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vn8mx"
Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.671804 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sndqw\" (UniqueName: \"kubernetes.io/projected/4f54343a-e5a5-4a0f-9940-5dabdbea5927-kube-api-access-sndqw\") pod \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") "
Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.671943 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-catalog-content\") pod \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") "
Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.672024 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-utilities\") pod \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") "
Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.672742 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-utilities" (OuterVolumeSpecName: "utilities") pod "4f54343a-e5a5-4a0f-9940-5dabdbea5927" (UID: "4f54343a-e5a5-4a0f-9940-5dabdbea5927"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.677685 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f54343a-e5a5-4a0f-9940-5dabdbea5927-kube-api-access-sndqw" (OuterVolumeSpecName: "kube-api-access-sndqw") pod "4f54343a-e5a5-4a0f-9940-5dabdbea5927" (UID: "4f54343a-e5a5-4a0f-9940-5dabdbea5927"). InnerVolumeSpecName "kube-api-access-sndqw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.720992 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f54343a-e5a5-4a0f-9940-5dabdbea5927" (UID: "4f54343a-e5a5-4a0f-9940-5dabdbea5927"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.773405 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.773447 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-utilities\") on node \"crc\" DevicePath \"\""
Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.773460 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sndqw\" (UniqueName: \"kubernetes.io/projected/4f54343a-e5a5-4a0f-9940-5dabdbea5927-kube-api-access-sndqw\") on node \"crc\" DevicePath \"\""
Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.172547 4816 generic.go:334] "Generic (PLEG): container finished" podID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerID="5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795" exitCode=0
Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.172747 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn8mx" event={"ID":"4f54343a-e5a5-4a0f-9940-5dabdbea5927","Type":"ContainerDied","Data":"5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795"}
Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.172993 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn8mx" event={"ID":"4f54343a-e5a5-4a0f-9940-5dabdbea5927","Type":"ContainerDied","Data":"7ee26dd0ee79cd708284dffb1050308eefe558df23e3cb8fc337e2dbccfbdb55"}
Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.172864 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vn8mx"
Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.173092 4816 scope.go:117] "RemoveContainer" containerID="5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795"
Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.195768 4816 scope.go:117] "RemoveContainer" containerID="574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb"
Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.205306 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vn8mx"]
Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.212544 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vn8mx"]
Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.214669 4816 scope.go:117] "RemoveContainer" containerID="0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67"
Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.243309 4816 scope.go:117] "RemoveContainer" containerID="5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795"
Mar 16 00:40:27 crc kubenswrapper[4816]: E0316 00:40:27.244016 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795\": container with ID starting with 5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795 not found: ID does not exist" containerID="5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795"
Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.244059 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795"} err="failed to get container status \"5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795\": rpc error: code = NotFound desc = could not find container \"5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795\": container with ID starting with 5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795 not found: ID does not exist"
Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.244088 4816 scope.go:117] "RemoveContainer" containerID="574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb"
Mar 16 00:40:27 crc kubenswrapper[4816]: E0316 00:40:27.244494 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb\": container with ID starting with 574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb not found: ID does not exist" containerID="574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb"
Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.244534 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb"} err="failed to get container status \"574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb\": rpc error: code = NotFound desc = could not find container \"574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb\": container with ID starting with 574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb not found: ID does not exist"
Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.244578 4816 scope.go:117] "RemoveContainer" containerID="0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67"
Mar 16 00:40:27 crc kubenswrapper[4816]: E0316 00:40:27.244856 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67\": container with ID starting with 0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67 not found: ID does not exist" containerID="0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67"
Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.244897 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67"} err="failed to get container status \"0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67\": rpc error: code = NotFound desc = could not find container \"0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67\": container with ID starting with 0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67 not found: ID does not exist"
Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.686515 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" path="/var/lib/kubelet/pods/4f54343a-e5a5-4a0f-9940-5dabdbea5927/volumes"
Mar 16 00:40:32 crc kubenswrapper[4816]: I0316 00:40:32.668044 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a"
Mar 16 00:40:32 crc kubenswrapper[4816]: E0316 00:40:32.668495 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53"
Mar 16 00:40:34 crc kubenswrapper[4816]: I0316 00:40:34.190124 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk_9a808114-3164-4abe-a481-1b5d3b9df2a0/prometheus-operator-admission-webhook/0.log"
Mar 16 00:40:34 crc kubenswrapper[4816]: I0316 00:40:34.216327 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn_36951342-3370-4291-baa3-2612f64036fd/prometheus-operator-admission-webhook/0.log"
Mar 16 00:40:34 crc kubenswrapper[4816]: I0316 00:40:34.220477 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-tfv44_562f24fe-5c4c-4540-96ae-6e01f539141b/prometheus-operator/0.log"
Mar 16 00:40:34 crc kubenswrapper[4816]: I0316 00:40:34.321188 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-w6wv7_8d0f60fa-8d26-43ea-a680-1d3a92dd270d/operator/0.log"
Mar 16 00:40:34 crc kubenswrapper[4816]: I0316 00:40:34.378610 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-t7w7m_f24959c1-f57f-4bf6-8a55-c8a35173ff8b/perses-operator/0.log"
Mar 16 00:40:35 crc kubenswrapper[4816]: I0316 00:40:35.261672 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rr89x"
Mar 16 00:40:35 crc kubenswrapper[4816]: I0316 00:40:35.317203 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rr89x"
Mar 16 00:40:35 crc kubenswrapper[4816]: I0316 00:40:35.392283 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rr89x"]
Mar 16 00:40:35 crc kubenswrapper[4816]: I0316 00:40:35.492697 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6z2gx"]
Mar 16 00:40:35 crc kubenswrapper[4816]: I0316 00:40:35.492976 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6z2gx" podUID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerName="registry-server" containerID="cri-o://700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15" gracePeriod=2
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.233227 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6z2gx"
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.245229 4816 generic.go:334] "Generic (PLEG): container finished" podID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerID="700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15" exitCode=0
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.245278 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z2gx" event={"ID":"9d1b1f79-de52-4ade-9a72-69b86c55e8ff","Type":"ContainerDied","Data":"700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15"}
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.245320 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6z2gx"
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.245347 4816 scope.go:117] "RemoveContainer" containerID="700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15"
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.245333 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z2gx" event={"ID":"9d1b1f79-de52-4ade-9a72-69b86c55e8ff","Type":"ContainerDied","Data":"d6634923f047727775df02d4d821820bfe16c08bbd5f740d3677d67d9b993223"}
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.268580 4816 scope.go:117] "RemoveContainer" containerID="083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce"
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.286833 4816 scope.go:117] "RemoveContainer" containerID="5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005"
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.312015 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-catalog-content\") pod \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") "
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.312085 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-utilities\") pod \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") "
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.312163 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pfdf\" (UniqueName: \"kubernetes.io/projected/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-kube-api-access-8pfdf\") pod \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") "
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.313058 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-utilities" (OuterVolumeSpecName: "utilities") pod "9d1b1f79-de52-4ade-9a72-69b86c55e8ff" (UID: "9d1b1f79-de52-4ade-9a72-69b86c55e8ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.318806 4816 scope.go:117] "RemoveContainer" containerID="700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15"
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.323056 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-kube-api-access-8pfdf" (OuterVolumeSpecName: "kube-api-access-8pfdf") pod "9d1b1f79-de52-4ade-9a72-69b86c55e8ff" (UID: "9d1b1f79-de52-4ade-9a72-69b86c55e8ff"). InnerVolumeSpecName "kube-api-access-8pfdf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:40:36 crc kubenswrapper[4816]: E0316 00:40:36.326731 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15\": container with ID starting with 700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15 not found: ID does not exist" containerID="700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15"
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.326791 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15"} err="failed to get container status \"700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15\": rpc error: code = NotFound desc = could not find container \"700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15\": container with ID starting with 700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15 not found: ID does not exist"
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.326820 4816 scope.go:117] "RemoveContainer" containerID="083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce"
Mar 16 00:40:36 crc kubenswrapper[4816]: E0316 00:40:36.327392 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce\": container with ID starting with 083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce not found: ID does not exist" containerID="083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce"
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.327434 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce"} err="failed to get container status \"083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce\": rpc error: code = NotFound desc = could not find container \"083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce\": container with ID starting with 083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce not found: ID does not exist"
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.327462 4816 scope.go:117] "RemoveContainer" containerID="5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005"
Mar 16 00:40:36 crc kubenswrapper[4816]: E0316 00:40:36.327717 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005\": container with ID starting with 5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005 not found: ID does not exist" containerID="5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005"
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.327748 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005"} err="failed to get container status \"5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005\": rpc error: code = NotFound desc = could not find container \"5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005\": container with ID starting with 5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005 not found: ID does not exist"
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.361118 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d1b1f79-de52-4ade-9a72-69b86c55e8ff" (UID: "9d1b1f79-de52-4ade-9a72-69b86c55e8ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.413284 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pfdf\" (UniqueName: \"kubernetes.io/projected/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-kube-api-access-8pfdf\") on node \"crc\" DevicePath \"\""
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.413324 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.413333 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-utilities\") on node \"crc\" DevicePath \"\""
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.577205 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6z2gx"]
Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.582534 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6z2gx"]
Mar 16 00:40:37 crc kubenswrapper[4816]: I0316 00:40:37.677096 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" path="/var/lib/kubelet/pods/9d1b1f79-de52-4ade-9a72-69b86c55e8ff/volumes"
Mar 16 00:40:45 crc kubenswrapper[4816]: I0316 00:40:45.668333 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a"
Mar 16 00:40:45 crc kubenswrapper[4816]: E0316 00:40:45.669434 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53"
Mar 16 00:40:57 crc kubenswrapper[4816]: I0316 00:40:57.686959 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a"
Mar 16 00:40:57 crc kubenswrapper[4816]: E0316 00:40:57.687991 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53"
Mar 16 00:41:05 crc kubenswrapper[4816]: I0316 00:41:05.735754 4816 scope.go:117] "RemoveContainer" containerID="0c26c66eab197680871c2539e7ed1477694cb8e32e0bc0cdad1221a9720899f7"
Mar 16 00:41:09 crc kubenswrapper[4816]: I0316 00:41:09.667804 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a"
Mar 16 00:41:09 crc kubenswrapper[4816]: E0316 00:41:09.668688 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53"
Mar 16 00:41:20 crc kubenswrapper[4816]: I0316 00:41:20.668200 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a"
Mar 16 00:41:20 crc kubenswrapper[4816]: E0316 00:41:20.669659 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53"
Mar 16 00:41:26 crc kubenswrapper[4816]: I0316 00:41:26.832546 4816 generic.go:334] "Generic (PLEG): container finished" podID="09265502-9e41-4783-ad8d-206a0b0372c8" containerID="494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa" exitCode=0
Mar 16 00:41:26 crc kubenswrapper[4816]: I0316 00:41:26.832590 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-75vc8/must-gather-jdjt6" event={"ID":"09265502-9e41-4783-ad8d-206a0b0372c8","Type":"ContainerDied","Data":"494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa"}
Mar 16 00:41:26 crc kubenswrapper[4816]: I0316 00:41:26.833612 4816 scope.go:117] "RemoveContainer" containerID="494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa"
Mar 16 00:41:27 crc kubenswrapper[4816]: I0316 00:41:27.100828 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-75vc8_must-gather-jdjt6_09265502-9e41-4783-ad8d-206a0b0372c8/gather/0.log"
Mar 16 00:41:33 crc kubenswrapper[4816]: I0316 00:41:33.668972 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a"
Mar 16 00:41:33 crc kubenswrapper[4816]: I0316 00:41:33.902565 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"08e6183680abbd0a1de1120a16451b7c7144bdc2fd91a97f582aa7293556cc2e"}
Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.119202 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-75vc8/must-gather-jdjt6"]
Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.119737 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-75vc8/must-gather-jdjt6" podUID="09265502-9e41-4783-ad8d-206a0b0372c8" containerName="copy" containerID="cri-o://8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a" gracePeriod=2
Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.126031 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-75vc8/must-gather-jdjt6"]
Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.467128 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-75vc8_must-gather-jdjt6_09265502-9e41-4783-ad8d-206a0b0372c8/copy/0.log"
Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.467858 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-75vc8/must-gather-jdjt6"
Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.575476 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x46w\" (UniqueName: \"kubernetes.io/projected/09265502-9e41-4783-ad8d-206a0b0372c8-kube-api-access-4x46w\") pod \"09265502-9e41-4783-ad8d-206a0b0372c8\" (UID: \"09265502-9e41-4783-ad8d-206a0b0372c8\") "
Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.575538 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09265502-9e41-4783-ad8d-206a0b0372c8-must-gather-output\") pod \"09265502-9e41-4783-ad8d-206a0b0372c8\" (UID: \"09265502-9e41-4783-ad8d-206a0b0372c8\") "
Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.580853 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09265502-9e41-4783-ad8d-206a0b0372c8-kube-api-access-4x46w" (OuterVolumeSpecName: "kube-api-access-4x46w") pod "09265502-9e41-4783-ad8d-206a0b0372c8" (UID: "09265502-9e41-4783-ad8d-206a0b0372c8"). InnerVolumeSpecName "kube-api-access-4x46w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.638248 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09265502-9e41-4783-ad8d-206a0b0372c8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "09265502-9e41-4783-ad8d-206a0b0372c8" (UID: "09265502-9e41-4783-ad8d-206a0b0372c8"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.676948 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x46w\" (UniqueName: \"kubernetes.io/projected/09265502-9e41-4783-ad8d-206a0b0372c8-kube-api-access-4x46w\") on node \"crc\" DevicePath \"\""
Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.676988 4816 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09265502-9e41-4783-ad8d-206a0b0372c8-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.912236 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-75vc8_must-gather-jdjt6_09265502-9e41-4783-ad8d-206a0b0372c8/copy/0.log"
Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.912995 4816 generic.go:334] "Generic (PLEG): container finished" podID="09265502-9e41-4783-ad8d-206a0b0372c8" containerID="8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a" exitCode=143
Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.913047 4816 scope.go:117] "RemoveContainer" containerID="8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a"
Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.913073 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-75vc8/must-gather-jdjt6"
Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.974361 4816 scope.go:117] "RemoveContainer" containerID="494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa"
Mar 16 00:41:35 crc kubenswrapper[4816]: I0316 00:41:35.009600 4816 scope.go:117] "RemoveContainer" containerID="8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a"
Mar 16 00:41:35 crc kubenswrapper[4816]: E0316 00:41:35.010095 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a\": container with ID starting with 8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a not found: ID does not exist" containerID="8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a"
Mar 16 00:41:35 crc kubenswrapper[4816]: I0316 00:41:35.010173 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a"} err="failed to get container status \"8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a\": rpc error: code = NotFound desc = could not find container \"8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a\": container with ID starting with 8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a not found: ID does not exist"
Mar 16 00:41:35 crc kubenswrapper[4816]: I0316 00:41:35.010209 4816 scope.go:117] "RemoveContainer" containerID="494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa"
Mar 16 00:41:35 crc kubenswrapper[4816]: E0316 00:41:35.010628 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa\": container with ID starting with 494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa not found: ID does not exist" containerID="494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa"
Mar 16 00:41:35 crc kubenswrapper[4816]: I0316 00:41:35.010659 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa"} err="failed to get container status \"494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa\": rpc error: code = NotFound desc = could not find container \"494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa\": container with ID starting with 494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa not found: ID does not exist"
Mar 16 00:41:35 crc kubenswrapper[4816]: I0316 00:41:35.675222 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09265502-9e41-4783-ad8d-206a0b0372c8" path="/var/lib/kubelet/pods/09265502-9e41-4783-ad8d-206a0b0372c8/volumes"
Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.166816 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560362-87zl7"]
Mar 16 00:42:00 crc kubenswrapper[4816]: E0316 00:42:00.167833 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerName="extract-utilities"
Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.167856 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerName="extract-utilities"
Mar 16 00:42:00 crc kubenswrapper[4816]: E0316 00:42:00.167876 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09265502-9e41-4783-ad8d-206a0b0372c8" containerName="copy"
Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.167893 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="09265502-9e41-4783-ad8d-206a0b0372c8" containerName="copy"
Mar 16 00:42:00 crc kubenswrapper[4816]: E0316 00:42:00.167907 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerName="extract-content"
Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.167919 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerName="extract-content"
Mar 16 00:42:00 crc kubenswrapper[4816]: E0316 00:42:00.167935 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerName="registry-server"
Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.167947 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerName="registry-server"
Mar 16 00:42:00 crc kubenswrapper[4816]: E0316 00:42:00.167969 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerName="extract-content"
Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.167981 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerName="extract-content"
Mar 16 00:42:00 crc kubenswrapper[4816]: E0316 00:42:00.167994 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerName="registry-server"
Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.168005 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerName="registry-server"
Mar 16 00:42:00 crc kubenswrapper[4816]: E0316 00:42:00.168029 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerName="extract-utilities"
Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.168041 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerName="extract-utilities"
Mar 16 00:42:00 crc kubenswrapper[4816]: E0316 00:42:00.168064 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09265502-9e41-4783-ad8d-206a0b0372c8" containerName="gather"
Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.168077 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="09265502-9e41-4783-ad8d-206a0b0372c8" containerName="gather"
Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.168294 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerName="registry-server"
Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.168316 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="09265502-9e41-4783-ad8d-206a0b0372c8" containerName="gather"
Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.168333 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerName="registry-server"
Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.168351 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="09265502-9e41-4783-ad8d-206a0b0372c8" containerName="copy"
Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.169082 4816 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560362-87zl7" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.172812 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.172917 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.172996 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.180430 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560362-87zl7"] Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.360774 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tlfc\" (UniqueName: \"kubernetes.io/projected/31ff5b41-45f3-4978-aed9-4ea3cf4d6736-kube-api-access-7tlfc\") pod \"auto-csr-approver-29560362-87zl7\" (UID: \"31ff5b41-45f3-4978-aed9-4ea3cf4d6736\") " pod="openshift-infra/auto-csr-approver-29560362-87zl7" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.469065 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tlfc\" (UniqueName: \"kubernetes.io/projected/31ff5b41-45f3-4978-aed9-4ea3cf4d6736-kube-api-access-7tlfc\") pod \"auto-csr-approver-29560362-87zl7\" (UID: \"31ff5b41-45f3-4978-aed9-4ea3cf4d6736\") " pod="openshift-infra/auto-csr-approver-29560362-87zl7" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.502676 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tlfc\" (UniqueName: \"kubernetes.io/projected/31ff5b41-45f3-4978-aed9-4ea3cf4d6736-kube-api-access-7tlfc\") pod \"auto-csr-approver-29560362-87zl7\" (UID: \"31ff5b41-45f3-4978-aed9-4ea3cf4d6736\") " 
pod="openshift-infra/auto-csr-approver-29560362-87zl7" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.802343 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560362-87zl7" Mar 16 00:42:01 crc kubenswrapper[4816]: I0316 00:42:01.044167 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560362-87zl7"] Mar 16 00:42:01 crc kubenswrapper[4816]: I0316 00:42:01.170115 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560362-87zl7" event={"ID":"31ff5b41-45f3-4978-aed9-4ea3cf4d6736","Type":"ContainerStarted","Data":"3b9aacb92328011ed3e7b23460f31c960e43bddb1bb6b0ed2e7b1fde4055ad1a"} Mar 16 00:42:03 crc kubenswrapper[4816]: I0316 00:42:03.195270 4816 generic.go:334] "Generic (PLEG): container finished" podID="31ff5b41-45f3-4978-aed9-4ea3cf4d6736" containerID="9c3a48820610a85f7b8e12ac2256cf65b198b7f676046e8539e7967f030e28c5" exitCode=0 Mar 16 00:42:03 crc kubenswrapper[4816]: I0316 00:42:03.195349 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560362-87zl7" event={"ID":"31ff5b41-45f3-4978-aed9-4ea3cf4d6736","Type":"ContainerDied","Data":"9c3a48820610a85f7b8e12ac2256cf65b198b7f676046e8539e7967f030e28c5"} Mar 16 00:42:04 crc kubenswrapper[4816]: I0316 00:42:04.482532 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560362-87zl7" Mar 16 00:42:04 crc kubenswrapper[4816]: I0316 00:42:04.636115 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tlfc\" (UniqueName: \"kubernetes.io/projected/31ff5b41-45f3-4978-aed9-4ea3cf4d6736-kube-api-access-7tlfc\") pod \"31ff5b41-45f3-4978-aed9-4ea3cf4d6736\" (UID: \"31ff5b41-45f3-4978-aed9-4ea3cf4d6736\") " Mar 16 00:42:04 crc kubenswrapper[4816]: I0316 00:42:04.645049 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31ff5b41-45f3-4978-aed9-4ea3cf4d6736-kube-api-access-7tlfc" (OuterVolumeSpecName: "kube-api-access-7tlfc") pod "31ff5b41-45f3-4978-aed9-4ea3cf4d6736" (UID: "31ff5b41-45f3-4978-aed9-4ea3cf4d6736"). InnerVolumeSpecName "kube-api-access-7tlfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:42:04 crc kubenswrapper[4816]: I0316 00:42:04.738195 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tlfc\" (UniqueName: \"kubernetes.io/projected/31ff5b41-45f3-4978-aed9-4ea3cf4d6736-kube-api-access-7tlfc\") on node \"crc\" DevicePath \"\"" Mar 16 00:42:05 crc kubenswrapper[4816]: I0316 00:42:05.216279 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560362-87zl7" event={"ID":"31ff5b41-45f3-4978-aed9-4ea3cf4d6736","Type":"ContainerDied","Data":"3b9aacb92328011ed3e7b23460f31c960e43bddb1bb6b0ed2e7b1fde4055ad1a"} Mar 16 00:42:05 crc kubenswrapper[4816]: I0316 00:42:05.216317 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b9aacb92328011ed3e7b23460f31c960e43bddb1bb6b0ed2e7b1fde4055ad1a" Mar 16 00:42:05 crc kubenswrapper[4816]: I0316 00:42:05.216383 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560362-87zl7" Mar 16 00:42:05 crc kubenswrapper[4816]: I0316 00:42:05.565128 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560356-vl86r"] Mar 16 00:42:05 crc kubenswrapper[4816]: I0316 00:42:05.568658 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560356-vl86r"] Mar 16 00:42:05 crc kubenswrapper[4816]: I0316 00:42:05.682002 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ffa2c59-99b2-4d5c-9c19-f0921ce688cb" path="/var/lib/kubelet/pods/0ffa2c59-99b2-4d5c-9c19-f0921ce688cb/volumes" Mar 16 00:42:05 crc kubenswrapper[4816]: I0316 00:42:05.868956 4816 scope.go:117] "RemoveContainer" containerID="a31a7a7d2a1fafc20c9ab619317cd87262faab560369319826f6c184261023c4" Mar 16 00:43:38 crc kubenswrapper[4816]: I0316 00:43:38.470441 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2pd6k"] Mar 16 00:43:38 crc kubenswrapper[4816]: E0316 00:43:38.471193 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31ff5b41-45f3-4978-aed9-4ea3cf4d6736" containerName="oc" Mar 16 00:43:38 crc kubenswrapper[4816]: I0316 00:43:38.471206 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="31ff5b41-45f3-4978-aed9-4ea3cf4d6736" containerName="oc" Mar 16 00:43:38 crc kubenswrapper[4816]: I0316 00:43:38.471342 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="31ff5b41-45f3-4978-aed9-4ea3cf4d6736" containerName="oc" Mar 16 00:43:38 crc kubenswrapper[4816]: I0316 00:43:38.472189 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2pd6k" Mar 16 00:43:38 crc kubenswrapper[4816]: I0316 00:43:38.482409 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2pd6k"] Mar 16 00:43:38 crc kubenswrapper[4816]: I0316 00:43:38.573820 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e44f2fd2-82cf-4057-a3f1-eed70204d566-catalog-content\") pod \"redhat-operators-2pd6k\" (UID: \"e44f2fd2-82cf-4057-a3f1-eed70204d566\") " pod="openshift-marketplace/redhat-operators-2pd6k" Mar 16 00:43:38 crc kubenswrapper[4816]: I0316 00:43:38.573901 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsjqz\" (UniqueName: \"kubernetes.io/projected/e44f2fd2-82cf-4057-a3f1-eed70204d566-kube-api-access-tsjqz\") pod \"redhat-operators-2pd6k\" (UID: \"e44f2fd2-82cf-4057-a3f1-eed70204d566\") " pod="openshift-marketplace/redhat-operators-2pd6k" Mar 16 00:43:38 crc kubenswrapper[4816]: I0316 00:43:38.573945 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e44f2fd2-82cf-4057-a3f1-eed70204d566-utilities\") pod \"redhat-operators-2pd6k\" (UID: \"e44f2fd2-82cf-4057-a3f1-eed70204d566\") " pod="openshift-marketplace/redhat-operators-2pd6k" Mar 16 00:43:38 crc kubenswrapper[4816]: I0316 00:43:38.675154 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsjqz\" (UniqueName: \"kubernetes.io/projected/e44f2fd2-82cf-4057-a3f1-eed70204d566-kube-api-access-tsjqz\") pod \"redhat-operators-2pd6k\" (UID: \"e44f2fd2-82cf-4057-a3f1-eed70204d566\") " pod="openshift-marketplace/redhat-operators-2pd6k" Mar 16 00:43:38 crc kubenswrapper[4816]: I0316 00:43:38.675196 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e44f2fd2-82cf-4057-a3f1-eed70204d566-utilities\") pod \"redhat-operators-2pd6k\" (UID: \"e44f2fd2-82cf-4057-a3f1-eed70204d566\") " pod="openshift-marketplace/redhat-operators-2pd6k" Mar 16 00:43:38 crc kubenswrapper[4816]: I0316 00:43:38.675286 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e44f2fd2-82cf-4057-a3f1-eed70204d566-catalog-content\") pod \"redhat-operators-2pd6k\" (UID: \"e44f2fd2-82cf-4057-a3f1-eed70204d566\") " pod="openshift-marketplace/redhat-operators-2pd6k" Mar 16 00:43:38 crc kubenswrapper[4816]: I0316 00:43:38.675848 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e44f2fd2-82cf-4057-a3f1-eed70204d566-catalog-content\") pod \"redhat-operators-2pd6k\" (UID: \"e44f2fd2-82cf-4057-a3f1-eed70204d566\") " pod="openshift-marketplace/redhat-operators-2pd6k" Mar 16 00:43:38 crc kubenswrapper[4816]: I0316 00:43:38.676024 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e44f2fd2-82cf-4057-a3f1-eed70204d566-utilities\") pod \"redhat-operators-2pd6k\" (UID: \"e44f2fd2-82cf-4057-a3f1-eed70204d566\") " pod="openshift-marketplace/redhat-operators-2pd6k" Mar 16 00:43:38 crc kubenswrapper[4816]: I0316 00:43:38.694953 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsjqz\" (UniqueName: \"kubernetes.io/projected/e44f2fd2-82cf-4057-a3f1-eed70204d566-kube-api-access-tsjqz\") pod \"redhat-operators-2pd6k\" (UID: \"e44f2fd2-82cf-4057-a3f1-eed70204d566\") " pod="openshift-marketplace/redhat-operators-2pd6k" Mar 16 00:43:38 crc kubenswrapper[4816]: I0316 00:43:38.788594 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2pd6k" Mar 16 00:43:39 crc kubenswrapper[4816]: I0316 00:43:39.039891 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2pd6k"] Mar 16 00:43:40 crc kubenswrapper[4816]: I0316 00:43:40.057774 4816 generic.go:334] "Generic (PLEG): container finished" podID="e44f2fd2-82cf-4057-a3f1-eed70204d566" containerID="dad72e21e35c85c435e2a625e19c03a90eff429e9e02e50c1401ac4df1544f54" exitCode=0 Mar 16 00:43:40 crc kubenswrapper[4816]: I0316 00:43:40.057864 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pd6k" event={"ID":"e44f2fd2-82cf-4057-a3f1-eed70204d566","Type":"ContainerDied","Data":"dad72e21e35c85c435e2a625e19c03a90eff429e9e02e50c1401ac4df1544f54"} Mar 16 00:43:40 crc kubenswrapper[4816]: I0316 00:43:40.058094 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pd6k" event={"ID":"e44f2fd2-82cf-4057-a3f1-eed70204d566","Type":"ContainerStarted","Data":"fe2a0e1ee1b145dc997d3d25cd5f337e84797bb4b7fe0589e0ddcc55df38a078"} Mar 16 00:43:41 crc kubenswrapper[4816]: I0316 00:43:41.066969 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pd6k" event={"ID":"e44f2fd2-82cf-4057-a3f1-eed70204d566","Type":"ContainerStarted","Data":"984f94f196233fcba0e92f689e75e67fa1b551c74798931a19c0e708d4f263e6"} Mar 16 00:43:42 crc kubenswrapper[4816]: I0316 00:43:42.078122 4816 generic.go:334] "Generic (PLEG): container finished" podID="e44f2fd2-82cf-4057-a3f1-eed70204d566" containerID="984f94f196233fcba0e92f689e75e67fa1b551c74798931a19c0e708d4f263e6" exitCode=0 Mar 16 00:43:42 crc kubenswrapper[4816]: I0316 00:43:42.078320 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pd6k" 
event={"ID":"e44f2fd2-82cf-4057-a3f1-eed70204d566","Type":"ContainerDied","Data":"984f94f196233fcba0e92f689e75e67fa1b551c74798931a19c0e708d4f263e6"} Mar 16 00:43:43 crc kubenswrapper[4816]: I0316 00:43:43.087334 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pd6k" event={"ID":"e44f2fd2-82cf-4057-a3f1-eed70204d566","Type":"ContainerStarted","Data":"b0c5b609a56392bfcab961f1c2659c984250a293d49725384f8a43353b91d377"} Mar 16 00:43:43 crc kubenswrapper[4816]: I0316 00:43:43.105780 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2pd6k" podStartSLOduration=2.518540051 podStartE2EDuration="5.105762721s" podCreationTimestamp="2026-03-16 00:43:38 +0000 UTC" firstStartedPulling="2026-03-16 00:43:40.05974712 +0000 UTC m=+2213.156047073" lastFinishedPulling="2026-03-16 00:43:42.64696978 +0000 UTC m=+2215.743269743" observedRunningTime="2026-03-16 00:43:43.102372676 +0000 UTC m=+2216.198672649" watchObservedRunningTime="2026-03-16 00:43:43.105762721 +0000 UTC m=+2216.202062674" Mar 16 00:43:48 crc kubenswrapper[4816]: I0316 00:43:48.790446 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2pd6k" Mar 16 00:43:48 crc kubenswrapper[4816]: I0316 00:43:48.791039 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2pd6k" Mar 16 00:43:49 crc kubenswrapper[4816]: I0316 00:43:49.830858 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2pd6k" podUID="e44f2fd2-82cf-4057-a3f1-eed70204d566" containerName="registry-server" probeResult="failure" output=< Mar 16 00:43:49 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 16 00:43:49 crc kubenswrapper[4816]: > 